[info] - query outside of layer bounds (18 milliseconds)
[info] - query disjunction on space (562 milliseconds)
[info] - query disjunction on space and time (396 milliseconds)
[info] - query at particular times (389 milliseconds)
[info] CoordinateSpaceTime query tests for z order by 6 months
[info] - query outside of layer bounds (11 milliseconds)
[info] - query disjunction on space (393 milliseconds)
[info] - query disjunction on space and time (423 milliseconds)
[info] - query at particular times (455 milliseconds)
[info] CoordinateSpaceTime query tests for hilbert using now
[info] - query outside of layer bounds (11 milliseconds)
[info] - query disjunction on space (388 milliseconds)
[info] - query disjunction on space and time (394 milliseconds)
[info] - query at particular times (347 milliseconds)
[info] CoordinateSpaceTime query tests for hilbert resolution
[info] - query outside of layer bounds (15 milliseconds)
[info] - query disjunction on space (348 milliseconds)
[info] - query disjunction on space and time (273 milliseconds)
[info] - query at particular times (261 milliseconds)
[info] updating for z order by year
[info] - should update a layer (1 second, 533 milliseconds)
[info] - should not update a layer (empty set) (3 milliseconds)
[info] - should not update a layer (keys out of bounds) (1 second, 427 milliseconds)
[info] - should update a layer with preset keybounds, new rdd not intersects already ingested (7 seconds, 138 milliseconds)
[info] - should update correctly inside the bounds of a metatile (2 seconds, 783 milliseconds)
[info] updating for z order by 6 months
[info] - should update a layer (2 seconds, 66 milliseconds)
[info] - should not update a layer (empty set) (3 milliseconds)
[info] - should not update a layer (keys out of bounds) (1 second, 602 milliseconds)
[info] - should update a layer with preset keybounds, new rdd not intersects already ingested (7 seconds, 110 milliseconds)
[info] - should update correctly inside the bounds of a metatile (2 seconds, 618 milliseconds)
[info] updating for hilbert using now
[info] - should update a layer (1 second, 951 milliseconds)
[info] - should not update a layer (empty set) (8 milliseconds)
[info] - should not update a layer (keys out of bounds) (1 second, 612 milliseconds)
[info] - should update a layer with preset keybounds, new rdd not intersects already ingested (8 seconds, 638 milliseconds)
[info] - should update correctly inside the bounds of a metatile (2 seconds, 337 milliseconds)
[info] updating for hilbert resolution
[info] - should update a layer (1 second, 665 milliseconds)
[info] - should not update a layer (empty set) (6 milliseconds)
[info] - should not update a layer (keys out of bounds) (1 second, 595 milliseconds)
[info] - should update a layer with preset keybounds, new rdd not intersects already ingested (7 seconds, 228 milliseconds)
[info] - should update correctly inside the bounds of a metatile (2 seconds, 556 milliseconds)
[info] RDDCostDistanceMethodsSpec:
[info] Cost-Distance Extension Methods
[info] - The costdistance Method Should Work (1/2) (3 seconds, 382 milliseconds)
[info] - The costdistance Method Should Work (2/2) (2 seconds, 184 milliseconds)
[info] ProjectedExtentRDDSplitMethodsSpec:
[info] Splitting an RDD[(ProjectedExtent, Tile)]
[info] - should split an example correctly (300 milliseconds)
14:31:02 TaskSetManager: Stage 0 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
14:31:02 TaskSetManager: Stage 1 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
14:31:02 TaskSetManager: Stage 3 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
14:31:03 TaskSetManager: Stage 4 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
[info] HillshadeSpec:
[info] Hillshade Elevation Spec
[info] - should get the same result on elevation for spark op as single raster op (954 milliseconds)
[info] - should get the same result on elevation for spark op as single raster op (collection api) (483 milliseconds)
[info] MultiplySpec:
[info] Multiply Operation
[info] - should multiply a constant by a raster (3 seconds, 977 milliseconds)
[info] - should multiply a raster by a constant (2 seconds, 379 milliseconds)
[info] - should multiply multiple rasters (3 seconds, 981 milliseconds)
[info] - should multiply multiple rasters as a seq (3 seconds, 262 milliseconds)
[info] LessOrEqualSpec:
[info] Less Or Equal Operation
[info] - should check less or equal between an integer and a raster (2 seconds, 185 milliseconds)
[info] - should check less or equal right associative between an integer and a raster (2 seconds, 299 milliseconds)
[info] - should check less or equal between a double and a raster (2 seconds, 224 milliseconds)
[info] - should check less or equal right associative between a double and a raster (2 seconds, 79 milliseconds)
[info] - should check less or equal between two rasters (3 seconds, 162 milliseconds)
[info] GreaterSpec:
[info] Greater Operation
[info] - should check greater between an integer and a raster (2 seconds, 314 milliseconds)
[info] - should check greater right associative between an integer and a raster (2 seconds, 707 milliseconds)
[info] - should check greater between a double and a raster (2 seconds, 633 milliseconds)
[info] - should check greater right associative between a double and a raster (2 seconds, 447 milliseconds)
[info] - should check greater or equal between two rasters (2 seconds, 912 milliseconds)
[info] ReorderedRDDSpec:
[info] - should reorder partitions (1 second, 288 milliseconds)
[info] - should reorder to empty (15 milliseconds)
[info] - should reorder from empty (455 milliseconds)
[info] SubtractSpec:
[info] Subtract Operation
[info] - should subtract a constant from a raster (2 seconds, 234 milliseconds)
[info] - should subtract from a constant, raster values (2 seconds, 212 milliseconds)
[info] - should subtract multiple rasters (3 seconds, 124 milliseconds)
[info] - should subtract multiple rasters as a seq (3 seconds, 294 milliseconds)
14:31:55 BlockManager: Putting block rdd_2_0 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif.
14:31:55 BlockManager: Block rdd_2_0 could not be removed as it was not found on disk or in memory
14:31:55 BlockManager: Putting block rdd_2_1 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif.
14:31:55 Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.<init>(URI.java:588)
at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
14:31:55 BlockManager: Block rdd_2_1 could not be removed as it was not found on disk or in memory
14:31:55 Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:31:55 TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:31:55 TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
14:31:55 TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:31:55 TaskSetManager: Lost task 2.0 in stage 0.0 (TID 2, localhost, executor driver): TaskKilled (Stage cancelled)
14:31:55 TaskSetManager: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): TaskKilled (Stage cancelled)
14:31:55 BlockManager: Putting block rdd_5_0 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_0.tiff.
14:31:55 BlockManager: Block rdd_5_0 could not be removed as it was not found on disk or in memory
14:31:55 Executor: Exception in task 0.0 in stage 1.0 (TID 4)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_0.tiff
14:31:55 BlockManager: Putting block rdd_5_1 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_1.tiff.
14:31:55 BlockManager: Block rdd_5_1 could not be removed as it was not found on disk or in memory
14:31:55 Executor: Exception in task 1.0 in stage 1.0 (TID 5)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_1.tiff
14:31:55 TaskSetManager: Lost task 0.0 in stage 1.0 (TID 4, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_0.tiff
14:31:55 TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job
14:31:55 TaskSetManager: Lost task 1.0 in stage 1.0 (TID 5, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_1.tiff
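The repeated URISyntaxException above comes from building a `java.net.URI` directly from a path whose workspace directory name contains spaces ("GeoTrellis New CI"); index 38 is exactly the space after "GeoTrellis". The strict single-string `URI` constructor rejects unescaped spaces, whereas `File.toURI()` percent-encodes them. A minimal sketch of that parsing behavior (the paths below are hypothetical, and this illustrates only the JDK behavior, not `HadoopGeoTiffRDD` itself):

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class UriSpaceDemo {
    /** Parse with the strict single-argument URI constructor; null on failure. */
    static URI parseStrict(String s) {
        try {
            return new URI(s);
        } catch (URISyntaxException e) {
            return null;  // "Illegal character in path" for an unescaped space
        }
    }

    public static void main(String[] args) {
        // Hypothetical path with a space, mirroring the CI workspace directory.
        String raw = "file:/jobs/My CI/tile.tif";

        // new URI(String) rejects the unescaped space.
        System.out.println(parseStrict(raw));  // null

        // The same string with the space percent-encoded parses fine.
        System.out.println(parseStrict("file:/jobs/My%20CI/tile.tif"));

        // File.toURI() does that encoding for you, yielding a legal URI.
        URI safe = new File("/jobs/My CI/tile.tif").toURI();
        System.out.println(safe);  // file:/jobs/My%20CI/tile.tif
    }
}
```

The multi-argument `new URI(scheme, host, path, fragment)` constructor also escapes illegal characters itself, so either route avoids the failure when a CI job name introduces spaces into the checkout path.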
[info] HadoopIngestSpec:
[info] - should allow filtering files in hadoopGeoTiffRDD *** FAILED *** (202 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info]
[info]   Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
[info]   at scala.Option.foreach(Option.scala:274)
[info]   ...
[info]   Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   ...
[info] - should allow overriding tiff file extensions in hadoopGeoTiffRDD *** FAILED *** (139 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 4, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_0.tiff
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.lang.Thread.run(Thread.java:748)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31mDriver stacktrace:[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.Option.foreach(Option.scala:274)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m ...[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-tiff/test-200506000000_0_0.tiff[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.fail(URI.java:2848)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.checkChars(URI.java:3021)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parseHierarchical(URI.java:3105)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parse(URI.java:3053)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI.<init>(URI.java:588)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m ...[0m[0m
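The recurring `URISyntaxException` above is caused by the unencoded space in the CI workspace directory name (`GeoTrellis New CI`): the single-string `java.net.URI` constructor parses its argument verbatim and rejects a raw space in the path. A minimal sketch of the failure mode, using a hypothetical `/tmp` path rather than the actual workspace, along with the usual workaround of building the URI through `java.io.File#toURI` (which percent-encodes reserved characters):

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class UriSpaceDemo {
    public static void main(String[] args) {
        // Hypothetical path mirroring the failing workspace layout.
        String raw = "file:/tmp/GeoTrellis New CI/cea.tif";

        // The single-string URI constructor does no encoding, so the
        // raw space triggers "Illegal character in path".
        boolean rejected = false;
        try {
            new URI(raw);
        } catch (URISyntaxException e) {
            rejected = true;
        }
        System.out.println("raw string rejected: " + rejected); // true

        // File#toURI percent-encodes the space, yielding a valid URI.
        URI encoded = new File("/tmp/GeoTrellis New CI/cea.tif").toURI();
        System.out.println(encoded); // file:/tmp/GeoTrellis%20New%20CI/cea.tif
    }
}
```

The simplest fix on the CI side is to place the job workspace at a path without spaces; the sketch only illustrates why every test that touches a resource file under this workspace fails the same way.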
[info] HttpRangeReaderProviderSpec:
[info] HttpRangeReaderProviderSpec
[info] - should create a HttpRangeReader from a URI *** FAILED *** (3 milliseconds)
[info]   java.net.ConnectException: Connection refused (Connection refused)
[info]   at java.net.PlainSocketImpl.socketConnect(Native Method)
[info]   at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
[info]   at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
[info]   at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
[info]   at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
[info]   at java.net.Socket.connect(Socket.java:589)
[info]   at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
[info]   at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
[info]   at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
[info]   at sun.net.www.http.HttpClient.<init>(HttpClient.java:242)
[info]   ...
[info] - should dectect a bad URL (0 milliseconds)
[info] - should fail to parse URIs with non-http schemes (0 milliseconds)
[info] HdfsRangeReaderProviderSpec:
[info] HdfsRangeReaderProviderSpec
[info] - should create a HdfsRangeReader from a URI (20 milliseconds)
14:31:56 TaskSetManager: Stage 0 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:56 TaskSetManager: Stage 1 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:57 TaskSetManager: Stage 2 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:57 TaskSetManager: Stage 3 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:57 TaskSetManager: Stage 4 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:57 TaskSetManager: Stage 5 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:58 TaskSetManager: Stage 6 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:31:58 TaskSetManager: Stage 7 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
[info] ZoomResampleMethodsSpec:
[info] Zoom Resample on TileLayerRDD - aspect.tif
[info] - should correctly crop by the rdd extent (238 milliseconds)
[info] - should correctly increase the number of tiles by 2 when going up one level (396 milliseconds)
[info] Zoom Resample on MultibandTileLayerRDD - aspect.tif
[info] - should correctly crop by the rdd extent (198 milliseconds)
[info] - should correctly increase the number of tiles by 2 when going up one level (420 milliseconds)
[info] Zoom Resample on TileLayerRDD - manual example
[info] - should correctly resample and filter in a larger example (543 milliseconds)
[info] Zoom Resample on MultibandTileLayerRDD - manual example
[info] - should correctly resample and filter in a larger example (502 milliseconds)
[info] TestCatalogSpec:
[info] geotrellis.store.TestCatalogSpec *** ABORTED *** (93 milliseconds)
[info]   java.net.URISyntaxException: Illegal character in path at index 33: /jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/vlm/aspect-tiled.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3063)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.util.RangeReader$.apply(RangeReader.scala:65)
[info]   at geotrellis.raster.geotiff.GeoTiffRasterSource.$anonfun$tiff$1(GeoTiffRasterSource.scala:38)
[info]   at scala.Option.getOrElse(Option.scala:138)
[info]   at geotrellis.raster.geotiff.GeoTiffRasterSource.tiff$lzycompute(GeoTiffRasterSource.scala:37)
[info]   at geotrellis.raster.geotiff.GeoTiffRasterSource.tiff(GeoTiffRasterSource.scala:34)
[info]   ...
[info] SumSpec:
[info] Sum Focal Spec
[info] - should square sum r = 1 for raster rdd (781 milliseconds)
[info] - should square sum with 5x5 neighborhood rdd (713 milliseconds)
[info] - should square sum with 5x5 neighborhood for NoData cells (607 milliseconds)
[info] - should circle sum for all cells (535 milliseconds)
[info] - should square sum r = 1 for raster collection (350 milliseconds)
[info] - should square sum with 5x5 neighborhood collection (366 milliseconds)
[info] - should circle sum for raster source collection (380 milliseconds)
[info] - should copy cells when no one satisfies the focus criteria (594 milliseconds)
[info] - should square sum r = 1 for na cells (854 milliseconds)
[info] - should square sum r = 1 for data cells (751 milliseconds)
[info] - should circle sum for data cells (720 milliseconds)
[info] ClipToGridSpec:
[info] ClipToGrid
[info] - should clip a point (383 milliseconds)
[info] - should clip a multipoint (216 milliseconds)
[info] - should clip a line (223 milliseconds)
[info] - should clip a line contained in one key (127 milliseconds)
[info] - should clip a polygon (187 milliseconds)
[info] HadoopRasterMethodsSpec:
[info] writing Rasters without errors and with correct tiles, crs and extent using Hadoop FSData{Input|Output} stream
[info] - should write GeoTiff with tags (482 milliseconds)
[info] - should write GeoTiff with tags with gzip (887 milliseconds)
[info] - should write Png (8 seconds, 132 milliseconds)
[info] - should write Png with gzip (8 seconds, 412 milliseconds)
[info] - should write Jpg (11 seconds, 110 milliseconds)
[info] - should write Jpg with gzip (9 seconds, 438 milliseconds)
[info] MeanSpec:
[info] Mean Zonal Summary Operation
[info] - should get correct mean over whole raster extent (852 milliseconds)
[info] - should get correct mean over whole raster extent for a MultibandTileRDD (1 second, 22 milliseconds)
[info] - should get correct mean over a quarter of the extent (1 second, 328 milliseconds)
[info] - should get correct mean over a quarter of the extent for a MultibandTileRDD (1 second, 402 milliseconds)
[info] Mean Zonal Summary Operation (collections api)
[info] - should get correct mean over whole raster extent (945 milliseconds)
[info] - should get correct mean over whole raster extent for MultibandTiles (993 milliseconds)
[info] - should get correct mean over a quarter of the extent (1 second, 135 milliseconds)
[info] - should get correct mean over a quarter of the extent for MultibandTiles (1 second, 15 milliseconds)
14:32:55 TaskSetManager: Stage 0 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
14:32:55 TaskSetManager: Stage 1 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
14:32:56 TaskSetManager: Stage 3 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
14:32:56 TaskSetManager: Stage 4 contains a task of very large size (264 KB). The maximum recommended task size is 100 KB.
[info] AspectSpec:
[info] Aspect Elevation Spec
[info] - should match gdal computed slope raster (965 milliseconds)
[info] - should match gdal computed slope raster (collections api) (496 milliseconds)
[info] PyramidSpec:
[info] Pyramid
[info] - should work with SpaceTimeKey rasters (594 milliseconds)
[info] - should pyramid Bounds[SpatialKey] (77 milliseconds)
[info] - should pyramid floating layer (428 milliseconds)
[info] - should produce the expected result for pyramid levels (785 milliseconds)
[info] HdfsUtilsSpec:
[info] HdfsUtils
[info] - should not crash with unuseful error messages when no files match listFiles (3 milliseconds)
[info] - should read the wole file if given whole file length (41 milliseconds)
[info] - should return an Array[Byte] of the correct size (2 milliseconds)
[info] - should read the correct range of bytes from a file (1 millisecond)
[info] LocalSpec:
[info] Local Operations
[info] - should local mask two rasters (3 seconds, 57 milliseconds)
[info] - should local inverse mask two rasters (3 seconds, 295 milliseconds)
[info] - should set all undefined values to 0 and the rest to one (2 seconds, 428 milliseconds)
[info] - should set all defined values to 0 and the rest to one (2 seconds, 339 milliseconds)
[info] - should square root all values in raster (2 seconds, 155 milliseconds)
[info] - should round all values in raster (2 seconds, 661 milliseconds)
[info] - should log all values in raster (2 seconds, 191 milliseconds)
[info] - should log base 10 all values in raster (2 seconds, 367 milliseconds)
[info] - should floor all values in raster (2 seconds, 257 milliseconds)
[info] - should ceil all values in raster (2 seconds, 182 milliseconds)
[info] - should negate all values in raster (2 seconds, 212 milliseconds)
[info] - should negate with unary operator all values in raster (2 seconds, 421 milliseconds)
[info] - should not all values in raster (2 seconds, 269 milliseconds)
[info] - should abs all values in raster (2 seconds, 820 milliseconds)
[info] - should arc cos all values in raster (2 seconds, 341 milliseconds)
[info] - should arc sin all values in raster (2 seconds, 441 milliseconds)
[info] - should arc tangent 2 all values in raster (3 seconds, 338 milliseconds)
[info] - should arc tan all values in raster (2 seconds, 404 milliseconds)
[info] - should cos all values in raster (2 seconds, 369 milliseconds)
[info] - should hyperbolic cos all values in raster (2 seconds, 196 milliseconds)
[info] - should sin all values in raster (2 seconds, 7 milliseconds)
[info] - should hyperbolic sin all values in raster (2 seconds, 230 milliseconds)
[info] - should tan all values in raster (1 second, 780 milliseconds)
[info] - should hyperbolic tan all values in raster (2 seconds, 204 milliseconds)
Reading points
Building Delaunay triangulations
Extracting BoundaryDelaunay objects
Forming baseline EuclideanDistanceTile
Forming stitched EuclideanDistance tile
Finished
SpatialKey(0,2) has 226 coordinates
SpatialKey(0,3) has 150 coordinates
SpatialKey(0,0) has 357 coordinates
SpatialKey(3,1) has 221 coordinates
SpatialKey(3,0) has 434 coordinates
SpatialKey(2,0) has 150 coordinates
SpatialKey(1,1) has 44 coordinates
SpatialKey(3,2) has 199 coordinates
SpatialKey(1,3) has 1 coordinates
SpatialKey(2,2) has 23 coordinates
SpatialKey(0,1) has 269 coordinates
SpatialKey(3,3) has 183 coordinates
SpatialKey(2,3) has 54 coordinates
SpatialKey(1,2) has 41 coordinates
SpatialKey(2,1) has 29 coordinates
SpatialKey(1,0) has 114 coordinates
Forming DelaunayTriangulations
Preparing input for stitching
Forming StitchedDelaunay
0: (0.5, 1.5, NaN)
1: (1.5, 0.5, NaN)
2: (2.5, 1.5, NaN)
3: (1.5, 2.5, NaN)
Resulting triangles: Vector((0,2,3), (0,1,2))
Rasterizing full point set
Rasterizing stitched point set
Done!
Forming DelaunayTriangulations
Preparing input for stitching
Forming StitchedDelaunay
0: (2.5, 0.5, NaN)
1: (2.5, 2.5, NaN)
Resulting triangles: Vector()
Rasterizing full point set
Rasterizing stitched point set
Done!
Loaded 459 points
GridBounds(655,1579,660,1586)
Computing baseline Euclidean distance tile (raster package)
Baseline has size (1536, 2048)
Computing sparse Euclidean distance (spark)
Stitched has size (1536, 2048)
[info] EuclideanDistanceSpec:
[info] Distributed Euclidean distance
[info] - should work for a real data set (2 seconds, 61 milliseconds)
[info] - should work in a spark environment (5 seconds, 867 milliseconds)
[info] - should work for zero- and one-point input partitions (396 milliseconds)
[info] - should work for a linear stitch result (484 milliseconds)
[info] - SparseEuclideanDistance should produce correct results (6 seconds, 686 milliseconds)
14:34:13 TaskSetManager: Stage 0 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:15 TaskSetManager: Stage 10 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:15 TaskSetManager: Stage 11 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:15 TaskSetManager: Stage 12 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:16 TaskSetManager: Stage 14 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:16 TaskSetManager: Stage 15 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:16 TaskSetManager: Stage 16 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:16 TaskSetManager: Stage 17 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:17 TaskSetManager: Stage 18 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
14:34:17 TaskSetManager: Stage 19 contains a task of very large size (989 KB). The maximum recommended task size is 100 KB.
[info] TileLayerRDDFilterMethodsSpec:
[info] SpaceTime TileLayerRDD Filter Methods
[info] - should filter out all items that are not at the given instant (756 milliseconds)
[info] - should produce an RDD whose keys are of type SpatialKey (85 milliseconds)
[info] - should obliviously drop the temporal dimension when requested to do so (non-unique) (243 milliseconds)
[info] - should obliviously drop the temporal dimension when requested to do so (unique) (327 milliseconds)
[info] Spatial TileLayerRDD Filter Methods
[info] - should correctly filter by a covering range (244 milliseconds)
[info] - should correctly filter by an intersecting range (179 milliseconds)
[info] - should correctly filter by an intersecting range given as a singleton (184 milliseconds)
[info] - should correctly filter by a non-intersecting range (194 milliseconds)
[info] - should correctly filter by multiple ranges (271 milliseconds)
[info] - should filter query by extent (330 milliseconds)
[info] - should filter query by point (465 milliseconds)
[info] - should filter query by point (temporal) (443 milliseconds)
[info] RasterSourceRDDSpec:
[info] geotrellis.spark.store.file.cog.COGFileAttributeStoreSpec *** ABORTED *** (28 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 0.0 failed 1 times, most recent failure: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/cea.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info]
[info] Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
[info]   at scala.Option.foreach(Option.scala:274)
[info]   ...
[info]   Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/cea.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   ...
[info] geotrellis.spark.RasterSourceRDDSpec *** ABORTED *** (7 milliseconds)
[info]   java.io.IOException was thrown inside describe("Match reprojection from HadoopGeoTiffRDD"), construction cannot continue: "No matching file(s) for path: /jobs/genie.geotrellis/GeoTrellis%20New%20CI/workspace/jdk/jdk1.8.0-latest/spark/target/scala-2.12/test-classes/vlm/aspect-tiled.tif" (RasterSourceRDDSpec.scala:84)
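The `RasterSourceRDDSpec` abort shows the converse problem: here the path reaches the filesystem lookup in its percent-encoded form (`GeoTrellis%20New%20CI`), so no file matches on disk. A URI must be decoded back into a platform path before being used for file access; `java.nio.file.Paths.get(URI)` does that decoding. A minimal sketch, again with a hypothetical `/tmp` path rather than the actual workspace:

```java
import java.net.URI;
import java.nio.file.Path;
import java.nio.file.Paths;

public class UriDecodeDemo {
    public static void main(String[] args) {
        // Hypothetical encoded file URI mirroring the failing lookup.
        URI encoded = URI.create("file:/tmp/GeoTrellis%20New%20CI/aspect-tiled.tif");

        // Treating the raw URI string as a filesystem path keeps the
        // literal "%20" and matches no file on disk.
        String literal = encoded.toString().substring("file:".length());
        System.out.println(literal); // /tmp/GeoTrellis%20New%20CI/aspect-tiled.tif

        // Paths.get(URI) decodes the escapes back into the real path.
        Path decoded = Paths.get(encoded);
        System.out.println(decoded); // /tmp/GeoTrellis New CI/aspect-tiled.tif
    }
}
```

Either direction of the mismatch (raw space handed to `new URI(...)`, or encoded `%20` handed to a file lookup) disappears once the workspace path itself contains no spaces, which is why the failures are confined to this CI layout.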
14:34:17 SparkSession$Builder: Using an existing SparkSession; some configuration may not take effect.
[info] LessSpec:
[info] Less Operation
[info] - should check less between an integer and a raster (2 seconds, 471 milliseconds)
[info] - should check less right associative between an integer and a raster (2 seconds, 470 milliseconds)
[info] - should check less between a double and a raster (2 seconds, 47 milliseconds)
[info] - should check less right associative between a double and a raster (1 second, 619 milliseconds)
[info] - should check less between two rasters (2 seconds, 712 milliseconds)
14:34:29 BlockManager: Putting block rdd_2_3 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/all-ones.tif.
14:34:29 BlockManager: Block rdd_2_3 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 3.0 in stage 0.0 (TID 3)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/all-ones.tif
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.<init>(URI.java:588)
at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
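The failure above is not raster-specific: `java.net.URI` rejects the raw space in the CI workspace path (`GeoTrellis New CI`), so every task that parses a `file:` path under that directory aborts. A minimal sketch of the behavior, using a hypothetical path with a space (one safe construction is `File.toURI()`, which percent-encodes the space before parsing):

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class UriSpaceDemo {
    // A raw space is illegal in a URI, so parsing a path like the one in
    // the log throws URISyntaxException; return the parser's reason text.
    static String parseError(String raw) {
        try {
            new URI(raw);
            return null;
        } catch (URISyntaxException e) {
            return e.getReason();
        }
    }

    // File.toURI() percent-encodes the space, yielding a parseable URI.
    static URI safeUri(String fsPath) {
        return new File(fsPath).toURI();
    }

    public static void main(String[] args) {
        System.out.println(parseError("file:/tmp/dir with space/a.tif"));
        System.out.println(safeUri("/tmp/dir with space/a.tif"));
    }
}
```

Renaming the CI workspace to avoid spaces, or encoding the path before handing it to `new URI(...)`, would avoid the repeated aborts below.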
14:34:29 TaskSetManager: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/all-ones.tif
14:34:29 TaskSetManager: Task 3 in stage 0.0 failed 1 times; aborting job
14:34:29 BlockManager: Putting block rdd_5_1 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif.
14:34:29 BlockManager: Block rdd_5_1 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 1.0 in stage 1.0 (TID 5)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:34:29 BlockManager: Putting block rdd_5_0 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif.
14:34:29 BlockManager: Block rdd_5_0 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 0.0 in stage 1.0 (TID 4)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:34:29 TaskSetManager: Lost task 1.0 in stage 1.0 (TID 5, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:34:29 TaskSetManager: Task 1 in stage 1.0 failed 1 times; aborting job
14:34:29 TaskSetManager: Lost task 0.0 in stage 1.0 (TID 4, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:34:29 TaskSetManager: Lost task 2.0 in stage 1.0 (TID 6, localhost, executor driver): TaskKilled (Stage cancelled)
14:34:29 BlockManager: Putting block rdd_8_1 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif.
14:34:29 BlockManager: Block rdd_8_1 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 1.0 in stage 2.0 (TID 8)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:34:29 BlockManager: Putting block rdd_8_0 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif.
14:34:29 BlockManager: Block rdd_8_0 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 0.0 in stage 2.0 (TID 7)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:34:29 TaskSetManager: Lost task 1.0 in stage 2.0 (TID 8, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:34:29 TaskSetManager: Task 1 in stage 2.0 failed 1 times; aborting job
14:34:29 TaskSetManager: Lost task 0.0 in stage 2.0 (TID 7, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:34:29 TaskSetManager: Lost task 2.0 in stage 2.0 (TID 9, localhost, executor driver): TaskKilled (Stage cancelled)
14:34:29 BlockManager: Putting block rdd_11_1 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif.
14:34:29 BlockManager: Block rdd_11_1 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 1.0 in stage 3.0 (TID 11)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:34:29 BlockManager: Putting block rdd_11_0 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif.
14:34:29 BlockManager: Block rdd_11_0 could not be removed as it was not found on disk or in memory
14:34:29 Executor: Exception in task 0.0 in stage 3.0 (TID 10)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:34:29 TaskSetManager: Lost task 1.0 in stage 3.0 (TID 11, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
14:34:29 TaskSetManager: Task 1 in stage 3.0 failed 1 times; aborting job
14:34:29 TaskSetManager: Lost task 0.0 in stage 3.0 (TID 10, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
14:34:29 TaskSetManager: Lost task 2.0 in stage 3.0 (TID 12, localhost, executor driver): TaskKilled (Stage cancelled)
14:34:29 TaskSetManager: Lost task 3.0 in stage 3.0 (TID 13, localhost, executor driver): TaskKilled (Stage cancelled)
14:34:30 BlockManager: Putting block rdd_14_3 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-multiband/result.tif.
14:34:30 BlockManager: Block rdd_14_3 could not be removed as it was not found on disk or in memory
14:34:30 Executor: Exception in task 3.0 in stage 4.0 (TID 17)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-multiband/result.tif
14:34:30 TaskSetManager: Lost task 3.0 in stage 4.0 (TID 17, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-multiband/result.tif
14:34:30 TaskSetManager: Task 3 in stage 4.0 failed 1 times; aborting job
14:34:30 BlockManager: Putting block rdd_17_0 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif.
14:34:30 BlockManager: Block rdd_17_0 could not be removed as it was not found on disk or in memory
14:34:30 Executor: Exception in task 0.0 in stage 5.0 (TID 18)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.<init>(URI.java:588)
at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
14:34:30 BlockManager: Putting block rdd_17_1 failed due to exception java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif.
14:34:30 BlockManager: Block rdd_17_1 could not be removed as it was not found on disk or in memory
14:34:30 Executor: Exception in task 1.0 in stage 5.0 (TID 19)
java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.<init>(URI.java:588)
at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
14:34:30 TaskSetManager: Lost task 0.0 in stage 5.0 (TID 18, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.<init>(URI.java:588)
at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
14:34:30 TaskSetManager: Task 0 in stage 5.0 failed 1 times; aborting job
14:34:30 TaskSetManager: Lost task 1.0 in stage 5.0 (TID 19, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
at java.net.URI$Parser.fail(URI.java:2848)
at java.net.URI$Parser.checkChars(URI.java:3021)
at java.net.URI$Parser.parseHierarchical(URI.java:3105)
at java.net.URI$Parser.parse(URI.java:3053)
at java.net.URI.<init>(URI.java:588)
at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
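Every failure above is the same root cause: a `java.net.URI` is constructed directly from a raw path string (the `geotrellis.spark.store.hadoop.HadoopGeoTiffRDD` frame at `HadoopGeoTiffRDD.scala:133`), and index 38 of `file:/jobs/genie.geotrellis/GeoTrellis New CI/...` is the space in the workspace directory name, which is not a legal character in an unencoded URI. A minimal sketch reproducing the parse failure, plus the common workaround of letting `File.toURI()` percent-encode the path (this illustrates the general fix, not necessarily the change made in GeoTrellis):

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class UriSpaceDemo {
    public static void main(String[] args) {
        // Raw string taken from the log; the space after "GeoTrellis" sits at index 38.
        String raw = "file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/all-ones.tif";
        try {
            new URI(raw); // throws: a literal space is illegal in a URI path
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage()); // e.g. "Illegal character in path at index 38: ..."
        }
        // Workaround: build the URI from a File so illegal characters are percent-encoded.
        URI encoded = new File("/jobs/genie.geotrellis/GeoTrellis New CI/workspace/all-ones.tif").toURI();
        System.out.println(encoded); // spaces become %20
    }
}
```

The same applies to Hadoop paths: deriving the URI from a `Path`/`File` object rather than re-parsing a plain string avoids the exception for any directory containing spaces.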
[info] HadoopGeoTiffRDDSpec:
[info] HadoopGeoTiffRDD
[info] - should filter by geometry *** FAILED *** (337 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 0.0 failed 1 times, most recent failure: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/all-ones.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info]
[info] Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
[info]   at scala.Option.foreach(Option.scala:274)
[info]   ...
[info]   Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/spark/src/test/resources/all-ones.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   ...
[info] - should read the same rasters when reading small windows or with no windows, Spatial, SinglebandGeoTiff *** FAILED *** (132 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 1.0 failed 1 times, most recent failure: Lost task 1.0 in stage 1.0 (TID 5, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info]
[info] Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
[info]   at scala.Option.foreach(Option.scala:274)
[info]   ...
[info]   Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   ...
[info] - should read the same rasters when reading small windows or with no windows, Spatial, MultibandGeoTiff *** FAILED *** (169 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 2.0 failed 1 times, most recent failure: Lost task 1.0 in stage 2.0 (TID 8, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info]
[info] Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
[info]   at scala.Option.foreach(Option.scala:274)
[info]   ...
[info]   Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   ...
[info] - should read the same rasters when reading small windows or with no windows, Temporal, SinglebandGeoTiff *** FAILED *** (135 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 3.0 failed 1 times, most recent failure: Lost task 1.0 in stage 3.0 (TID 11, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)
[info]   at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)
[info]   at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
[info]   at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
[info]   at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
[info]   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
[info]   at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
[info]   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
[info]   at org.apache.spark.scheduler.Task.run(Task.scala:123)
[info]   at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[info]   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:748)
[info]
[info] Driver stacktrace:
[info]   at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)
[info]   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[info]   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
[info]   at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
[info]   at scala.Option.foreach(Option.scala:274)
[info]   ...
[info]   Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_1.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[info]   at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)
[info]   at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)
[info]   at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
[info]   ...
[info] - should read the same rasters when reading small windows or with no windows, Temporal, MultibandGeoTiff *** FAILED *** (243 milliseconds)
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 4.0 failed 1 times, most recent failure: Lost task 3.0 in stage 4.0 (TID 17, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-multiband/result.tif
[info]   at java.net.URI$Parser.fail(URI.java:2848)
[info]   at java.net.URI$Parser.checkChars(URI.java:3021)
[info]   at java.net.URI$Parser.parseHierarchical(URI.java:3105)
[info]   at java.net.URI$Parser.parse(URI.java:3053)
[info]   at java.net.URI.<init>(URI.java:588)
[info]   at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)
[info]   at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.Task.run(Task.scala:123)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.lang.Thread.run(Thread.java:748)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31mDriver stacktrace:[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.Option.foreach(Option.scala:274)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m ...[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles-multiband/result.tif[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.fail(URI.java:2848)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.checkChars(URI.java:3021)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parseHierarchical(URI.java:3105)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parse(URI.java:3053)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI.<init>(URI.java:588)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m ...[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m- should read the rasters with each raster path handling *** FAILED *** (154 milliseconds)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 5.0 failed 1 times, most recent failure: Lost task 0.0 in stage 5.0 (TID 18, localhost, executor driver): java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.fail(URI.java:2848)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.checkChars(URI.java:3021)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parseHierarchical(URI.java:3105)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parse(URI.java:3053)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI.<init>(URI.java:588)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:299)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1165)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.Task.run(Task.scala:123)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.lang.Thread.run(Thread.java:748)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31mDriver stacktrace:[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1889)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1877)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1876)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.Option.foreach(Option.scala:274)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m ...[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m Cause: java.net.URISyntaxException: Illegal character in path at index 38: file:/jobs/genie.geotrellis/GeoTrellis New CI/workspace/jdk/jdk1.8.0-latest/raster/data/one-month-tiles/test-200506000000_0_0.tif[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.fail(URI.java:2848)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.checkChars(URI.java:3021)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parseHierarchical(URI.java:3105)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI$Parser.parse(URI.java:3053)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at java.net.URI.<init>(URI.java:588)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at geotrellis.spark.store.hadoop.HadoopGeoTiffRDD$.$anonfun$apply$1(HadoopGeoTiffRDD.scala:133)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:484)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:490)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)[0m[0m
[0m[[0m[0minfo[0m] [0m[0m[31m ...[0m[0m
[info] MinSpec:
[info] Min Double Zonal Summary Operation
[info] - should get correct double min over whole raster extent (1 second, 114 milliseconds)
[info] - should get the correct min over the whole raster extent for a MultibandTileRDD (923 milliseconds)
[info] - should get correct double min over a quarter of the extent (1 second, 73 milliseconds)
[info] - should get correct double min over a quarter of the extent for a MultibandTileRDD (2 seconds, 498 milliseconds)
[info] Min Double Zonal Summary Operation (collections api)
[info] - should get correct double min over whole raster extent (989 milliseconds)
[info] - should get the correct min over the whole raster extent for MultibandTiles (1 second, 218 milliseconds)
[info] - should get correct double min over a quarter of the extent (1 second, 75 milliseconds)
[info] - should get correct double min over a quarter of the extent for MultibandTiles (1 second, 179 milliseconds)
[info] IterativeCostDistanceSpec:
[info] Iterative Cost Distance
[info] - Should correctly compute resolution (1 millisecond)
[info] - Should correctly project input points (949 milliseconds)
[info] - Should propogate left (1 second, 907 milliseconds)
[info] - Should propogate right (2 seconds, 65 milliseconds)
[info] - Should propogate up (1 second, 690 milliseconds)
[info] - Should propogate down (1 second, 860 milliseconds)
[info] Run completed in 28 minutes, 39 seconds.
[info] Total number of tests run: 1439
[info] Suites: completed 107, aborted 7
[info] Tests: succeeded 1418, failed 21, canceled 0, ignored 0, pending 0
[info] *** 7 SUITES ABORTED ***
[info] *** 21 TESTS FAILED ***
[error] Failed tests:
[error] 	geotrellis.spark.RasterRegionSpec
[error] 	geotrellis.spark.ingest.IngestSpec
[error] 	geotrellis.spark.RasterSummarySpec
[error] 	geotrellis.spark.store.file.cog.COGFileSpatialSpec
[error] 	geotrellis.spark.store.hadoop.HadoopIngestSpec
[error] 	geotrellis.spark.store.http.util.HttpRangeReaderProviderSpec
[error] 	geotrellis.spark.store.hadoop.HadoopGeoTiffRDDSpec
[error] Error during tests:
[error] 	geotrellis.spark.store.http.util.HttpRangeReaderSpec
[error] 	geotrellis.store.GeoTrellisConvertedRasterSourceSpec
[error] 	geotrellis.spark.store.hadoop.cog.COGHadoopAttributeStoreSpec
[error] 	geotrellis.spark.store.file.cog.COGFileAttributeStoreSpec
[error] 	geotrellis.store.GeoTrellisRasterSourceSpec
[error] 	geotrellis.spark.RasterSourceRDDSpec
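Note on the failures above: every `URISyntaxException` points at the same root cause, the space in the Jenkins workspace path (`.../GeoTrellis New CI/...`), which `new URI(String)` at `HadoopGeoTiffRDD.scala:133` rejects as an illegal character. The sketch below is purely illustrative (the sample path is made up, and this is not the project's actual fix); it reproduces the failure mode and shows that constructing the URI via `File.toURI()`, which percent-encodes the space, avoids it.

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class UriSpaceDemo {
    public static void main(String[] args) {
        // Hypothetical path containing a space, like the CI workspace above.
        String raw = "file:/jobs/ci/GeoTrellis New CI/data/test.tif";

        // new URI(String) parses strictly and rejects an unencoded space,
        // throwing the same "Illegal character in path" error seen in the log.
        try {
            new URI(raw);
            System.out.println("parsed (unexpected)");
        } catch (URISyntaxException e) {
            System.out.println("URISyntaxException: " + e.getReason());
        }

        // File.toURI() percent-encodes the space, yielding a legal file: URI.
        URI safe = new File("/jobs/ci/GeoTrellis New CI/data/test.tif").toURI();
        System.out.println(safe); // the space appears as %20
    }
}
```

A simpler workaround for a CI run like this one is to avoid spaces in the job or workspace name entirely.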
14:57:00 factory: Disposing class org.geotools.referencing.factory.epsg.hsql.ThreadedHsqlEpsgFactory backing store
14:57:00 ENGINE: Database closed
19/11/27 22:04:10 INFO SparkContext: Invoking stop() from shutdown hook
19/11/27 22:04:10 INFO ShutdownHookManager: Shutdown hook called
19/11/27 22:04:10 INFO ShutdownHookManager: Deleting directory /tmp/genie.geotrellis/spark-9fb97a60-d46c-4c08-8706-11c14c9d39ed
19/11/27 22:04:10 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/27 22:04:10 INFO MemoryStore: MemoryStore cleared
19/11/27 22:04:10 INFO BlockManager: BlockManager stopped
19/11/27 22:04:10 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/27 22:04:10 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/27 22:04:10 INFO SparkContext: Successfully stopped SparkContext
19/11/27 22:04:10 INFO ShutdownHookManager: Shutdown hook called
19/11/27 22:04:10 INFO ShutdownHookManager: Deleting directory /tmp/genie.geotrellis/spark-d031c02c-20e5-41cd-b67e-84ace9355977
Build was aborted
Aborted by echeipesh@gmail.com
Started calculate disk usage of build
Finished Calculation of disk usage of build in 0 seconds
Started calculate disk usage of workspace
Finished Calculation of disk usage of workspace in 0 seconds
Finished: ABORTED