51Testing Software Testing Forum

Subject: Help: appcrawler keeps throwing an error (error message below)

Author: 测试积点老人    Time: 2020-9-1 16:18
Subject: Help: appcrawler keeps throwing an error (error message below)
The error is javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’ (roughly: "A location path was expected, but the following token was encountered: ‘行情’"). How do I resolve this? The full log is below:
  AutomationSuite:
  2020-04-22 00:15:56 INFO [AutomationSuite.13.beforeAll] beforeAll
  2020-04-22 00:15:56 INFO [AutomationSuite.21.$anonfun$new$1] testcase start
  2020-04-22 00:15:56 INFO [AutomationSuite.28.$anonfun$new$2] Step(null,null,null,跳过,click,null,0)
  2020-04-22 00:15:56 INFO [AutomationSuite.31.$anonfun$new$2] 跳过
  2020-04-22 00:15:56 INFO [AutomationSuite.32.$anonfun$new$2] click
  2020-04-22 00:15:56 INFO [Crawler.996.doElementAction] current element = Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
  2020-04-22 00:15:56 INFO [Crawler.997.doElementAction] current index = 1
  2020-04-22 00:15:56 INFO [Crawler.998.doElementAction] current action = click
  2020-04-22 00:15:56 INFO [Crawler.999.doElementAction] current xpath = //*[@resource-id="com.ykkg.lz:id/action_bar_root"]//*[@resource-id="android:id/content"]//*[@resource-id="com.ykkg.lz:id/rl_root"]//*[@text="跳过" and @resource-id="com.ykkg.lz:id/tv_jump"]
  2020-04-22 00:15:56 INFO [Crawler.1000.doElementAction] current url = Steps
  2020-04-22 00:15:56 INFO [Crawler.1001.doElementAction] current tag path = hierarchy/android.widget.FrameLayout/android.widget.LinearLayout/android.widget.FrameLayout/android.widget.LinearLayout/android.widget.FrameLayout/android.widget.RelativeLayout/android.widget.TextView
  2020-04-22 00:15:56 INFO [Crawler.1002.doElementAction] current file name = Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
  2020-04-22 00:15:56 INFO [AppCrawler$.59.saveReqHash] save reqHash to 1
  2020-04-22 00:15:56 INFO [AppCrawler$.92.saveReqImg] save reqImg 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png to 1
  2020-04-22 00:15:56 INFO [AppCrawler$.76.saveReqDom] save reqDom to 1
  2020-04-22 00:15:56 INFO [Crawler.1071.doElementAction] need input click
  2020-04-22 00:15:56 INFO [AppiumClient.53.findElementByURI] find by uri element= Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
  2020-04-22 00:15:56 INFO [AppiumClient.245.findElementsByURI] findElementByAndroidUIAutomator new UiSelector().className("android.widget.TextView").text("跳过").resourceId("com.ykkg.lz:id/tv_jump")
  2020-04-22 00:15:56 INFO [AppiumClient.60.findElementByURI] find by xpath success
  2020-04-22 00:15:56 INFO [Crawler.1080.doElementAction] mark 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png to 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png
  2020-04-22 00:15:56 INFO [AppiumClient.141.mark] read from 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png
  2020-04-22 00:15:59 INFO [AppiumClient.154.mark] write png 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png
  2020-04-22 00:15:59 INFO [AppiumClient.161.mark] ImageIO.write newImageName 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png
  2020-04-22 00:16:00 INFO [Crawler.1095.$anonfun$doElementAction$5] click element
  2020-04-22 00:16:00 INFO [AppiumClient.174.click] [[io.appium.java_client.android.AndroidDriver, Capabilities: {app=, appActivity=com.dx168.efsmobile.application.SplashActivity, appPackage=com.ykkg.lz, appium=http://127.0.0.1:4723/wd/hub, databaseEnabled=false, desired={platformName=android, appium=http://127.0.0.1:4723/wd/hub, app=, appActivity=com.dx168.efsmobile.application.SplashActivity, appPackage=com.ykkg.lz, deviceName=demo, fullReset=false, noReset=true}, deviceApiLevel=22, deviceManufacturer=vivo, deviceModel=vivo X7, deviceName=db20dbf7, deviceScreenDensity=480, deviceScreenSize=1080x1920, deviceUDID=db20dbf7, fullReset=false, javascriptEnabled=true, locationContextEnabled=false, networkConnectionEnabled=true, noReset=true, pixelRatio=3, platform=LINUX, platformName=Android, platformVersion=5.1.1, statBarHeight=72, takesScreenshot=true, viewportRect={left=0, top=72, width=1080, height=1848}, warnings={}, webStorageEnabled=false}] -> -android uiautomator: new UiSelector().className("android.widget.TextView").text("跳过").resourceId("com.ykkg.lz:id/tv_jump")]
  2020-04-22 00:16:02 INFO [Crawler.1126.doElementAction] mark image exist
  2020-04-22 00:16:02 INFO [Crawler.1130.doElementAction] sleep 500 for loading
  2020-04-22 00:16:03 INFO [Crawler.627.refreshPage] refresh page
  2020-04-22 00:16:03 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
  2020-04-22 00:16:03 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
  2020-04-22 00:16:03 INFO [Crawler.645.parsePageContext] appName =
  2020-04-22 00:16:03 INFO [Crawler.649.parsePageContext] url=MainActivity
  2020-04-22 00:16:04 INFO [Crawler.673.parsePageContext] currentContentHash=150a5c764690a3d0af3e1fffdab3c011 lastContentHash=d10e6132ee8d3b2bd7fa6854da178699
  2020-04-22 00:16:04 INFO [Crawler.675.parsePageContext] ui change
  2020-04-22 00:16:04 INFO [Crawler.931.saveDom] save to 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.dom
  2020-04-22 00:16:04 INFO [Crawler.953.saveScreen] start screenshot
  2020-04-22 00:16:04 INFO [Crawler.956.$anonfun$saveScreen$2] ui change screenshot again
  2020-04-22 00:16:05 INFO [Crawler.977.saveScreen] screenshot success
  2020-04-22 00:16:05 INFO [AppCrawler$.67.saveResHash] save resHash to 1
  2020-04-22 00:16:05 INFO [AppCrawler$.101.saveResImg] save resImg 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.clicked.png to 1
  2020-04-22 00:16:05 INFO [AppCrawler$.84.saveResDom] save resDom to 1
  2020-04-22 00:16:05 INFO [AutomationSuite.66.$anonfun$new$1] finish run steps
  run steps
  2020-04-22 00:16:05 INFO [Crawler.627.refreshPage] refresh page
  2020-04-22 00:16:05 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
  2020-04-22 00:16:06 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
  2020-04-22 00:16:06 INFO [Crawler.645.parsePageContext] appName =
  2020-04-22 00:16:06 INFO [Crawler.649.parsePageContext] url=MainActivity
  2020-04-22 00:16:06 INFO [Crawler.673.parsePageContext] currentContentHash=3d9930cd9d5328c64b4fef63ed06d2ba lastContentHash=150a5c764690a3d0af3e1fffdab3c011
  2020-04-22 00:16:06 INFO [Crawler.675.parsePageContext] ui change
  2020-04-22 00:16:06 INFO [Crawler.1213.handleCtrlC] add shutdown hook
  2020-04-22 00:16:06 INFO [Crawler.772.crawl] crawl next
  2020-04-22 00:16:06 INFO [Crawler.425.needReturn] urlStack=Stack(MainActivity) baseUrl=List() maxDepth=10
  2020-04-22 00:16:06 INFO [Crawler.834.crawl] no need to back
  2020-04-22 00:16:06 INFO [Crawler.487.getAvailableElement] selected nodes size = 9
  2020-04-22 00:16:06 ERROR [Crawler.193.crawl] crawl not finish, return with exception
  2020-04-22 00:16:06 ERROR [Crawler.194.crawl] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
  2020-04-22 00:16:06 ERROR [Crawler.195.crawl] TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
  2020-04-22 00:16:06 ERROR [Crawler.196.crawl] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] [wrapped] javax.xml.xpath.XPathExpressionException: javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.error(XPathParser.java:612)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.LocationPath(XPathParser.java:1603)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PathExpr(XPathParser.java:1319)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnionExpr(XPathParser.java:1238)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnaryExpr(XPathParser.java:1144)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.MultiplicativeExpr(XPathParser.java:1065)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AdditiveExpr(XPathParser.java:1007)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelationalExpr(XPathParser.java:932)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:872)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:896)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:836)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:842)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.OrExpr(XPathParser.java:809)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Expr(XPathParser.java:792)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PredicateExpr(XPathParser.java:1956)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Predicate(XPathParser.java:1938)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Step(XPathParser.java:1728)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelativeLocationPath(XPathParser.java:1628)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.LocationPath(XPathParser.java:1599)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PathExpr(XPathParser.java:1319)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnionExpr(XPathParser.java:1238)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnaryExpr(XPathParser.java:1144)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.MultiplicativeExpr(XPathParser.java:1065)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AdditiveExpr(XPathParser.java:1007)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelationalExpr(XPathParser.java:932)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:872)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:836)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.OrExpr(XPathParser.java:809)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Expr(XPathParser.java:792)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.initXPath(XPathParser.java:131)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.XPath.<init>(XPath.java:180)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.XPath.<init>(XPath.java:268)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.jaxp.XPathImpl.compile(XPathImpl.java:390)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListFromXML(XPathUtil.scala:167)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListFromXPath(XPathUtil.scala:183)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListByKey(XPathUtil.scala:271)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$getAvailableElement$4(Crawler.scala:493)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$getAvailableElement$4$adapted(Crawler.scala:491)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.immutable.List.foreach(List.scala:389)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.generic.TraversableForwarder.foreach(TraversableForwarder.scala:35)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.generic.TraversableForwarder.foreach$(TraversableForwarder.scala:35)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:44)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.getAvailableElement(Crawler.scala:491)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:840)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$crawl$1(Crawler.scala:187)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.util.Try$.apply(Try.scala:209)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:187)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.start(Crawler.scala:170)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.startCrawl(AppCrawler.scala:322)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.parseParams(AppCrawler.scala:290)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.main(AppCrawler.scala:91)
  2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler.main(AppCrawler.scala)
  2020-04-22 00:16:06 ERROR [Crawler.198.crawl] create new session
  2020-04-22 00:16:06 INFO [Crawler.214.restart] execute shell on restart
  2020-04-22 00:16:06 INFO [Crawler.217.restart] restart appium
  2020-04-22 00:16:06 INFO [Crawler.250.setupAppium] afterPageMax=2
  2020-04-22 00:16:06 INFO [Crawler.273.setupAppium] use AppiumClient
I did add "行情" to the blacklist, but I'm not sure whether that is what is causing this.
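One thing worth checking, based on the stack trace: the failure happens while the blacklist is evaluated (getAvailableElement → XPathUtil.getNodeListByKey → getNodeListFromXPath → XPathImpl.compile), so the blacklist entry ends up being compiled as an XPath expression. The offending token in the message, ‘行情’, is wrapped in curly quotation marks, while XPath only accepts straight ASCII quotes (' and ") as string delimiters, so one plausible cause is that the entry, or an expression built from it, contains curly or full-width quotes around 行情. Below is a minimal standalone Scala sketch, not part of appcrawler, that compiles two candidate expressions with the same JDK XPath API to show the difference; both sample expressions are illustrative assumptions, not taken from the actual config:

  // Hypothetical standalone check; the sample expressions are assumptions for illustration.
  import javax.xml.xpath.{XPathExpressionException, XPathFactory}

  object BlackListXPathCheck {
    // Compile a candidate expression with the JDK's javax.xml.xpath API
    // (the same API XPathUtil uses, per the stack trace) and report the result.
    def tryCompile(expr: String): Unit = {
      val xpath = XPathFactory.newInstance().newXPath()
      try {
        xpath.compile(expr)
        println(s"OK     : $expr")
      } catch {
        case e: XPathExpressionException =>
          // The wrapped cause is the TransformerException shown in the crawler log.
          println(s"FAILED : $expr -> ${Option(e.getCause).getOrElse(e)}")
      }
    }

    def main(args: Array[String]): Unit = {
      // Curly quotes are not valid XPath string delimiters, so this fails with the
      // same "location path expected" (需要位置路径) parse error seen in the log.
      tryCompile("//*[@text=‘行情’]")
      // The same predicate written with straight quotes compiles cleanly.
      tryCompile("""//*[contains(@text, "行情")]""")
    }
  }

If that is indeed the cause, rewriting the blacklist entry with straight quotes (or as a plain pattern without any quotation marks) should let it compile; the exact form appcrawler expects for blackList entries is worth confirming against the documentation for the version in use.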


Author: 海海豚    Time: 2020-9-2 10:14
https://www.cnblogs.com/surewing ... utm_medium=referral  Have a look at this for reference.
Author: 郭小贱    Time: 2020-9-2 13:01
What is appcrawler? This is the first time I've heard of it; good to learn about it.
Author: bellas    Time: 2020-9-2 13:34
See this link for reference: https://testerhome.com/search?q=appium+%E8%B7%AF%E5%BE%84
Author: jingzizx    Time: 2020-9-2 14:20
This might be the problem; try debugging it and see.



