After upgrading to CDH 5.8 and recompiling Zeppelin with
mvn clean package -Pspark-1.6 -Dhadoop.version=2.6.0-cdh5.8.0 -Phadoop-2.6 -Pvendor-repo -DskipTests
running pyspark from Zeppelin failed with the error below:
Py4JJavaError: An error occurred while calling o0.textFile.
: com.fasterxml.jackson.databind.JsonMappingException: Could not find creator property with name 'id' (in class org.apache.spark.rdd.RDDOperationScope)
at [Source: {"id":"2","name":"textFile"}; line: 1, column: 1]
at com.fasterxml.jackson.databind.JsonMappingException.from(JsonMappingException.java:148)
at com.fasterxml.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:843)
at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.addBeanProps(BeanDeserializerFactory.java:533)
at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:220)
at com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:409)
at com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:358)
at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:265)
at com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:245)
at com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:143)
at com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:439)
at com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3666)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3558)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2578)
at org.apache.spark.rdd.RDDOperationScope$.fromJson(RDDOperationScope.scala:85)
at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
at org.apache.spark.rdd.RDDOperationScope$$anonfun$5.apply(RDDOperationScope.scala:136)
at scala.Option.map(Option.scala:145)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:136)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:1022)
at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:843)
at org.apache.spark.SparkContext$$anonfun$textFile$1.apply(SparkContext.scala:841)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.textFile(SparkContext.scala:841)
at org.apache.spark.api.java.JavaSparkContext.textFile(JavaSparkContext.scala:188)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)
(<class 'py4j.protocol.Py4JJavaError'>, Py4JJavaError(u'An error occurred while calling o0.textFile.\n', JavaObject id=o137), <traceback object at 0x7f8ad3629098>)
This is a Jackson version conflict: the Zeppelin build bundles jackson-databind 2.5.3, which clashes with the Jackson version Spark 1.6 expects when deserializing RDDOperationScope. In the Zeppelin install directory, delete
zeppelin-server/target/lib/jackson-databind-2.5.3.jar
zeppelin-zengine/target/lib/jackson-databind-2.5.3.jar
and restart Zeppelin; the error goes away.
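The fix can be scripted; a minimal sketch, assuming Zeppelin was built under `$ZEPPELIN_HOME` (the variable and the default path are assumptions, adjust to your layout; the daemon script location follows the standard Zeppelin source tree):

```shell
# Remove the bundled jackson-databind 2.5.3 jars so Spark's own Jackson wins.
# ZEPPELIN_HOME is an assumption -- point it at your Zeppelin build directory.
ZEPPELIN_HOME=${ZEPPELIN_HOME:-/opt/zeppelin}
rm -f "$ZEPPELIN_HOME/zeppelin-server/target/lib/jackson-databind-2.5.3.jar"
rm -f "$ZEPPELIN_HOME/zeppelin-zengine/target/lib/jackson-databind-2.5.3.jar"
# Restart Zeppelin so the interpreter classpath is rebuilt (skipped if the
# daemon script is not present at this path).
[ -x "$ZEPPELIN_HOME/bin/zeppelin-daemon.sh" ] \
  && "$ZEPPELIN_HOME/bin/zeppelin-daemon.sh" restart \
  || true
```

Deleting the jars rather than replacing them works because Spark already ships a compatible Jackson on its own classpath; Zeppelin only breaks when its newer copy shadows it.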