Help: appcrawler keeps failing with the error javax.xml.transform.TransformerExce...

1# Posted on 2021-2-24 10:17:47
When running appcrawler I keep getting the error javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’ (a location path was expected, but the token ‘行情’ was encountered). How do I fix this? The full crawl log is below:
    1. AutomationSuite:
    2. 2020-04-22 00:15:56 INFO [AutomationSuite.13.beforeAll] beforeAll
    3. 2020-04-22 00:15:56 INFO [AutomationSuite.21.$anonfun$new$1] testcase start
    4. 2020-04-22 00:15:56 INFO [AutomationSuite.28.$anonfun$new$2] Step(null,null,null,跳过,click,null,0)
    5. 2020-04-22 00:15:56 INFO [AutomationSuite.31.$anonfun$new$2] 跳过
    6. 2020-04-22 00:15:56 INFO [AutomationSuite.32.$anonfun$new$2] click
    7. 2020-04-22 00:15:56 INFO [Crawler.996.doElementAction] current element = Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
    8. 2020-04-22 00:15:56 INFO [Crawler.997.doElementAction] current index = 1
    9. 2020-04-22 00:15:56 INFO [Crawler.998.doElementAction] current action = click
    10. 2020-04-22 00:15:56 INFO [Crawler.999.doElementAction] current xpath = //*[@resource-id="com.ykkg.lz:id/action_bar_root"]//*[@resource-id="android:id/content"]//*[@resource-id="com.ykkg.lz:id/rl_root"]//*[@text="跳过" and @resource-id="com.ykkg.lz:id/tv_jump"]
    11. 2020-04-22 00:15:56 INFO [Crawler.1000.doElementAction] current url = Steps
    12. 2020-04-22 00:15:56 INFO [Crawler.1001.doElementAction] current tag path = hierarchy/android.widget.FrameLayout/android.widget.LinearLayout/android.widget.FrameLayout/android.widget.LinearLayout/android.widget.FrameLayout/android.widget.RelativeLayout/android.widget.TextView
    13. 2020-04-22 00:15:56 INFO [Crawler.1002.doElementAction] current file name = Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
    14. 2020-04-22 00:15:56 INFO [AppCrawler$.59.saveReqHash] save reqHash to 1
    15. 2020-04-22 00:15:56 INFO [AppCrawler$.92.saveReqImg] save reqImg 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png to 1
    16. 2020-04-22 00:15:56 INFO [AppCrawler$.76.saveReqDom] save reqDom to 1
    17. 2020-04-22 00:15:56 INFO [Crawler.1071.doElementAction] need input click
    18. 2020-04-22 00:15:56 INFO [AppiumClient.53.findElementByURI] find by uri element= Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
    19. 2020-04-22 00:15:56 INFO [AppiumClient.245.findElementsByURI] findElementByAndroidUIAutomator new UiSelector().className("android.widget.TextView").text("跳过").resourceId("com.ykkg.lz:id/tv_jump")
    20. 2020-04-22 00:15:56 INFO [AppiumClient.60.findElementByURI] find by xpath success
    21. 2020-04-22 00:15:56 INFO [Crawler.1080.doElementAction] mark 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png to 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png
    22. 2020-04-22 00:15:56 INFO [AppiumClient.141.mark] read from 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png
    23. 2020-04-22 00:15:59 INFO [AppiumClient.154.mark] write png 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png
    24. 2020-04-22 00:15:59 INFO [AppiumClient.161.mark] ImageIO.write newImageName 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png
    25. 2020-04-22 00:16:00 INFO [Crawler.1095.$anonfun$doElementAction$5] click element
    26. 2020-04-22 00:16:00 INFO [AppiumClient.174.click] [[io.appium.java_client.android.AndroidDriver, Capabilities: {app=, appActivity=com.dx168.efsmobile.application.SplashActivity, appPackage=com.ykkg.lz, appium=http://127.0.0.1:4723/wd/hub, databaseEnabled=false, desired={platformName=android, appium=http://127.0.0.1:4723/wd/hub, app=, appActivity=com.dx168.efsmobile.application.SplashActivity, appPackage=com.ykkg.lz, deviceName=demo, fullReset=false, noReset=true}, deviceApiLevel=22, deviceManufacturer=vivo, deviceModel=vivo X7, deviceName=db20dbf7, deviceScreenDensity=480, deviceScreenSize=1080x1920, deviceUDID=db20dbf7, fullReset=false, javascriptEnabled=true, locationContextEnabled=false, networkConnectionEnabled=true, noReset=true, pixelRatio=3, platform=LINUX, platformName=Android, platformVersion=5.1.1, statBarHeight=72, takesScreenshot=true, viewportRect={left=0, top=72, width=1080, height=1848}, warnings={}, webStorageEnabled=false}] -> -android uiautomator: new UiSelector().className("android.widget.TextView").text("跳过").resourceId("com.ykkg.lz:id/tv_jump")]
    27. 2020-04-22 00:16:02 INFO [Crawler.1126.doElementAction] mark image exist
    28. 2020-04-22 00:16:02 INFO [Crawler.1130.doElementAction] sleep 500 for loading
    29. 2020-04-22 00:16:03 INFO [Crawler.627.refreshPage] refresh page
    30. 2020-04-22 00:16:03 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
    31. 2020-04-22 00:16:03 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
    32. 2020-04-22 00:16:03 INFO [Crawler.645.parsePageContext] appName =
    33. 2020-04-22 00:16:03 INFO [Crawler.649.parsePageContext] url=MainActivity
    34. 2020-04-22 00:16:04 INFO [Crawler.673.parsePageContext] currentContentHash=150a5c764690a3d0af3e1fffdab3c011 lastContentHash=d10e6132ee8d3b2bd7fa6854da178699
    35. 2020-04-22 00:16:04 INFO [Crawler.675.parsePageContext] ui change
    36. 2020-04-22 00:16:04 INFO [Crawler.931.saveDom] save to 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.dom
    37. 2020-04-22 00:16:04 INFO [Crawler.953.saveScreen] start screenshot
    38. 2020-04-22 00:16:04 INFO [Crawler.956.$anonfun$saveScreen$2] ui change screenshot again
    39. 2020-04-22 00:16:05 INFO [Crawler.977.saveScreen] screenshot success
    40. 2020-04-22 00:16:05 INFO [AppCrawler$.67.saveResHash] save resHash to 1
    41. 2020-04-22 00:16:05 INFO [AppCrawler$.101.saveResImg] save resImg 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.clicked.png to 1
    42. 2020-04-22 00:16:05 INFO [AppCrawler$.84.saveResDom] save resDom to 1
    43. 2020-04-22 00:16:05 INFO [AutomationSuite.66.$anonfun$new$1] finish run steps

    44. run steps
        2020-04-22 00:16:05 INFO [Crawler.627.refreshPage] refresh page
        2020-04-22 00:16:05 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
        2020-04-22 00:16:06 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
        2020-04-22 00:16:06 INFO [Crawler.645.parsePageContext] appName =
        2020-04-22 00:16:06 INFO [Crawler.649.parsePageContext] url=MainActivity
        2020-04-22 00:16:06 INFO [Crawler.673.parsePageContext] currentContentHash=3d9930cd9d5328c64b4fef63ed06d2ba lastContentHash=150a5c764690a3d0af3e1fffdab3c011
        2020-04-22 00:16:06 INFO [Crawler.675.parsePageContext] ui change
        2020-04-22 00:16:06 INFO [Crawler.1213.handleCtrlC] add shutdown hook
    45. 2020-04-22 00:16:06 INFO [Crawler.772.crawl] crawl next
    46. 2020-04-22 00:16:06 INFO [Crawler.425.needReturn] urlStack=Stack(MainActivity) baseUrl=List() maxDepth=10
    47. 2020-04-22 00:16:06 INFO [Crawler.834.crawl] no need to back
    48. 2020-04-22 00:16:06 INFO [Crawler.487.getAvailableElement] selected nodes size = 9
    49. 2020-04-22 00:16:06 ERROR [Crawler.193.crawl] crawl not finish, return with exception
    50. 2020-04-22 00:16:06 ERROR [Crawler.194.crawl] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
    51. 2020-04-22 00:16:06 ERROR [Crawler.195.crawl] TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
    52. 2020-04-22 00:16:06 ERROR [Crawler.196.crawl] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
    53. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
    54. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] [wrapped] javax.xml.xpath.XPathExpressionException: javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
    55. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.error(XPathParser.java:612)
    56. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.LocationPath(XPathParser.java:1603)
    57. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PathExpr(XPathParser.java:1319)
    58. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnionExpr(XPathParser.java:1238)
    59. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnaryExpr(XPathParser.java:1144)
    60. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.MultiplicativeExpr(XPathParser.java:1065)
    61. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AdditiveExpr(XPathParser.java:1007)
    62. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelationalExpr(XPathParser.java:932)
    63. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:872)
    64. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:896)
    65. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:836)
    66. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:842)
    67. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.OrExpr(XPathParser.java:809)
    68. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Expr(XPathParser.java:792)
    69. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PredicateExpr(XPathParser.java:1956)
    70. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Predicate(XPathParser.java:1938)
    71. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Step(XPathParser.java:1728)
    72. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelativeLocationPath(XPathParser.java:1628)
    73. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.LocationPath(XPathParser.java:1599)
    74. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PathExpr(XPathParser.java:1319)
    75. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnionExpr(XPathParser.java:1238)
    76. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnaryExpr(XPathParser.java:1144)
    77. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.MultiplicativeExpr(XPathParser.java:1065)
    78. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AdditiveExpr(XPathParser.java:1007)
    79. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelationalExpr(XPathParser.java:932)
    80. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:872)
    81. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:836)
    82. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.OrExpr(XPathParser.java:809)
    83. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Expr(XPathParser.java:792)
    84. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.initXPath(XPathParser.java:131)
    85. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.XPath.<init>(XPath.java:180)
    86. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.XPath.<init>(XPath.java:268)
    87. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.jaxp.XPathImpl.compile(XPathImpl.java:390)
    88. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListFromXML(XPathUtil.scala:167)
    89. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListFromXPath(XPathUtil.scala:183)
    90. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListByKey(XPathUtil.scala:271)
    91. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$getAvailableElement$4(Crawler.scala:493)
    92. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$getAvailableElement$4$adapted(Crawler.scala:491)
    93. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.immutable.List.foreach(List.scala:389)
    94. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.generic.TraversableForwarder.foreach(TraversableForwarder.scala:35)
    95. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.generic.TraversableForwarder.foreach$(TraversableForwarder.scala:35)
    96. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:44)
    97. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.getAvailableElement(Crawler.scala:491)
    98. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:840)
    99. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$crawl$1(Crawler.scala:187)
    100. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
    101. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.util.Try$.apply(Try.scala:209)
    102. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:187)
    103. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.start(Crawler.scala:170)
    104. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.startCrawl(AppCrawler.scala:322)
    105. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.parseParams(AppCrawler.scala:290)
    106. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.main(AppCrawler.scala:91)
    107. 2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler.main(AppCrawler.scala)
    108. 2020-04-22 00:16:06 ERROR [Crawler.198.crawl] create new session
    109. 2020-04-22 00:16:06 INFO [Crawler.214.restart] execute shell on restart
    110. 2020-04-22 00:16:06 INFO [Crawler.217.restart] restart appium
    111. 2020-04-22 00:16:06 INFO [Crawler.250.setupAppium] afterPageMax=2
    112. 2020-04-22 00:16:06 INFO [Crawler.273.setupAppium] use AppiumClient
I did add “行情” to the blacklist; I am not sure whether that is what triggers the error.
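Judging from the bottom of the stack trace (XPathImpl.compile is reached through com.testerhome.appcrawler.XPathUtil$.getNodeListByKey), the blacklist key appears to be handed to the JDK XPath compiler verbatim, and a bare text string such as 行情 is not a valid location path. Here is a minimal sketch, outside appcrawler, that reproduces the failure using only the standard javax.xml.xpath API on the JDK that produced this trace (the object name XPathKeyCheck is just for illustration):

```scala
import javax.xml.xpath.{XPathExpressionException, XPathFactory}

object XPathKeyCheck {
  def main(args: Array[String]): Unit = {
    val xpath = XPathFactory.newInstance().newXPath()

    // A bare text key is not an XPath location path, so the JDK parser rejects
    // it -- this mirrors the TransformerException seen in the crawl log.
    try xpath.compile("行情")
    catch {
      case e: XPathExpressionException =>
        println(s"bare key rejected: ${e.getMessage}")
    }

    // The same text wrapped in a location path compiles without error.
    xpath.compile("""//*[contains(@text, "行情")]""")
    println("location path compiled fine")
  }
}
```

If that is indeed what happens inside getNodeListByKey, writing the blacklist entry as a full XPath expression such as //*[contains(@text, "行情")] instead of the bare text should avoid the parse error.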


2# Posted on 2021-2-25 11:05:45
Using the text content should work too, shouldn't it?
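If the crawler really does compile blacklist keys as XPath (as the trace suggests), one way to act on this advice is to normalize plain-text keys into a text-match location path before they ever reach the compiler. A hypothetical helper for illustration, not appcrawler's actual code:

```scala
object BlackListKeys {
  // Keys that already look like XPath pass through untouched; plain-text keys
  // such as "行情" become a text-match location path, so the XPath compiler
  // never sees a bare string.
  def keyToXPath(key: String): String =
    if (key.trim.startsWith("/")) key
    else s"""//*[@text="$key" or contains(@text, "$key")]"""

  def main(args: Array[String]): Unit = {
    println(keyToXPath("行情"))                   // //*[@text="行情" or contains(@text, "行情")]
    println(keyToXPath("""//*[@text="跳过"]""")) // returned unchanged
  }
}
```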