Here is the log from a program that computes the number π. The warning that shows up in the log puzzles me.
- How can I get rid of the warning that appears on the 3rd line?
- Do I actually need to do anything about it, and what would that change?
I'm not familiar with Spark or Hadoop, so I'd really appreciate some help and your perspective on this issue.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/02/28 03:28:41 INFO SparkContext: Running Spark version 1.4.0
16/02/28 03:28:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/02/28 03:28:41 INFO SecurityManager: Changing view acls to: Adam
16/02/28 03:28:42 INFO SecurityManager: Changing modify acls to: Adam
16/02/28 03:28:42 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Adam); users with modify permissions: Set(Adam)
16/02/28 03:28:42 INFO Slf4jLogger: Slf4jLogger started
16/02/28 03:28:42 INFO Remoting: Starting remoting
16/02/28 03:28:43 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:52223]
16/02/28 03:28:43 INFO Utils: Successfully started service 'sparkDriver' on port 52223.
16/02/28 03:28:43 INFO SparkEnv: Registering MapOutputTracker
16/02/28 03:28:43 INFO SparkEnv: Registering BlockManagerMaster
16/02/28 03:28:43 INFO DiskBlockManager: Created local directory at C:\Users\Adam\AppData\Local\Temp\spark-3313d7e1-649e-41d3-beed-dbf30e0fafa9\blockmgr-d7ec9b86-3170-4efd-adff-9ad84dce953c
16/02/28 03:28:43 INFO MemoryStore: MemoryStore started with capacity 969.8 MB
16/02/28 03:28:43 INFO HttpFileServer: HTTP File server directory is C:\Users\Adam\AppData\Local\Temp\spark-3313d7e1-649e-41d3-beed-dbf30e0fafa9\httpd-d2c8c0d2-6270-48b8-a61b-9107fc99922e
16/02/28 03:28:43 INFO HttpServer: Starting HTTP Server
16/02/28 03:28:44 INFO Utils: Successfully started service 'HTTP file server' on port 52224.
16/02/28 03:28:44 INFO SparkEnv: Registering OutputCommitCoordinator
16/02/28 03:28:44 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/02/28 03:28:44 INFO SparkUI: Started SparkUI at http://192.168.56.1:4040
16/02/28 03:28:44 INFO Executor: Starting executor ID driver on host localhost
16/02/28 03:28:45 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 52243.
16/02/28 03:28:45 INFO NettyBlockTransferService: Server created on 52243
16/02/28 03:28:45 INFO BlockManagerMaster: Trying to register BlockManager
16/02/28 03:28:45 INFO BlockManagerMasterEndpoint: Registering block manager localhost:52243 with 969.8 MB RAM, BlockManagerId(driver, localhost, 52243)
16/02/28 03:28:45 INFO BlockManagerMaster: Registered BlockManager
16/02/28 03:28:46 INFO SparkContext: Starting job: reduce at JavaSparkPi.java:37
16/02/28 03:28:46 INFO DAGScheduler: Got job 0 (reduce at JavaSparkPi.java:37) with 2 output partitions (allowLocal=false)
16/02/28 03:28:46 INFO DAGScheduler: Final stage: ResultStage 0(reduce at JavaSparkPi.java:37)
16/02/28 03:28:46 INFO DAGScheduler: Parents of final stage: List()
16/02/28 03:28:46 INFO DAGScheduler: Missing parents: List()
16/02/28 03:28:46 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at JavaSparkPi.java:33), which has no missing parents
16/02/28 03:28:47 INFO MemoryStore: ensureFreeSpace(3088) called with curMem=0, maxMem=1016950947
16/02/28 03:28:47 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.0 KB, free 969.8 MB)
16/02/28 03:28:47 INFO MemoryStore: ensureFreeSpace(1765) called with curMem=3088, maxMem=1016950947
16/02/28 03:28:47 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1765.0 B, free 969.8 MB)
16/02/28 03:28:47 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:52243 (size: 1765.0 B, free: 969.8 MB)
16/02/28 03:28:47 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:874
16/02/28 03:28:47 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at JavaSparkPi.java:33)
16/02/28 03:28:47 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
16/02/28 03:28:47 WARN TaskSetManager: Stage 0 contains a task of very large size (977 KB). The maximum recommended task size is 100 KB.
16/02/28 03:28:47 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1001438 bytes)
16/02/28 03:28:47 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1001438 bytes)
16/02/28 03:28:47 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
16/02/28 03:28:47 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/02/28 03:28:48 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 736 bytes result sent to driver
16/02/28 03:28:48 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 736 bytes result sent to driver
16/02/28 03:28:48 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 563 ms on localhost (1/2)
16/02/28 03:28:48 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 852 ms on localhost (2/2)
16/02/28 03:28:48 INFO DAGScheduler: ResultStage 0 (reduce at JavaSparkPi.java:37) finished in 0.867 s
16/02/28 03:28:48 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/02/28 03:28:48 INFO DAGScheduler: Job 0 finished: reduce at JavaSparkPi.java:37, took 1.621955 s
Pi is roughly 3.14624
16/02/28 03:28:48 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
16/02/28 03:28:48 INFO DAGScheduler: Stopping DAGScheduler
16/02/28 03:28:48 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/02/28 03:28:48 INFO Utils: path = C:\Users\Adam\AppData\Local\Temp\spark-3313d7e1-649e-41d3-beed-dbf30e0fafa9\blockmgr-d7ec9b86-3170-4efd-adff-9ad84dce953c, already present as root for deletion.
16/02/28 03:28:48 INFO MemoryStore: MemoryStore cleared
16/02/28 03:28:48 INFO BlockManager: BlockManager stopped
16/02/28 03:28:48 INFO BlockManagerMaster: BlockManagerMaster stopped
16/02/28 03:28:48 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/02/28 03:28:48 INFO SparkContext: Successfully stopped SparkContext
16/02/28 03:28:48 INFO Utils: Shutdown hook called
16/02/28 03:28:48 INFO Utils: Deleting directory C:\Users\Adam\AppData\Local\Temp\spark-3313d7e1-649e-41d3-beed-dbf30e0fafa9
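For what it's worth, a common way to silence that WARN on the 3rd line is to raise the log level for the class that emits it. This is a minimal sketch, assuming Spark 1.x with its default log4j setup; `conf/log4j.properties` is the conventional location (copied from `log4j.properties.template`), not something shown in this log:

```properties
# Suppress the "Unable to load native-hadoop library" message by raising
# the NativeCodeLoader logger from WARN to ERROR.
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```

The warning itself appears to be harmless here: Spark falls back to its built-in Java classes, as the message says, and on Windows the native Hadoop library is typically not available anyway. The job in the log completes normally and prints the π estimate, so suppressing the message is cosmetic.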