Friday, August 8, 2014

Solr 4.9.0 Tomcat SEVERE: Error filterStart

Error

SEVERE: Error filterStart

Solution:

For Solr version 4.9.0

These JARs are required in the tomcat/7.0.23/lib/ directory-

slf4j-api-1.7.6.jar
slf4j-log4j12-1.7.6.jar
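
Solr 4.x logs through SLF4J, and the logging filter fails to start ("Error filterStart") when no SLF4J binding is on Tomcat's classpath. As a quick sanity check (a minimal sketch, not from the original post; note that slf4j-log4j12 also needs log4j itself on the classpath), the following prints which binding actually resolved-

import org.slf4j.ILoggerFactory;
import org.slf4j.LoggerFactory;

// Prints the SLF4J logger factory that resolved at runtime. With
// slf4j-log4j12 on the classpath this reports a Log4j-backed factory;
// with no binding at all, SLF4J falls back to a no-op implementation.
public class Slf4jBindingCheck {
    public static void main(String[] args) {
        ILoggerFactory factory = LoggerFactory.getILoggerFactory();
        System.out.println("SLF4J binding in use: " + factory.getClass().getName());
    }
}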

Thursday, February 14, 2013

Java - String comparison

Does comparing strings with == give you a bellyache? It compares object identity, not contents.
Never mind, you can compare their contents like this-

System.out.println(gg1.compareTo(gg3));
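
For a self-contained example (the gg1/gg3 names are just illustrative), here is the difference between ==, equals(), and compareTo()-

public class StringCompareDemo {
    public static void main(String[] args) {
        String gg1 = "apple";
        String gg3 = new String("apple");   // same contents, different object

        System.out.println(gg1 == gg3);          // false: compares references
        System.out.println(gg1.equals(gg3));     // true: compares contents
        System.out.println(gg1.compareTo(gg3));  // 0: lexicographically equal
    }
}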

Saturday, February 9, 2013

URL-encode an HTTP GET Solr request and parse JSON using the Gson Java library

Make a request to the Solr web server and parse the JSON response into Java classes using Gson, as sketched below.

The Java classes must have the same variable names as the JSON's keys. Notice how nested JSON elements are handled.
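
A minimal sketch of the whole flow, assuming a hypothetical URL, response shape, and class names (Solr's wt=json responses nest the documents under a "response" object, which is the nesting referred to above)-

import com.google.gson.Gson;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class SolrGsonDemo {

    // Assumed JSON to be parsed (shape only, values illustrative):
    // {"response": {"numFound": 2, "docs": [{"id": "1", "title": "foo"}, ...]}}

    // The corresponding Java classes: field names match the JSON keys,
    // and the nested "response" object becomes a nested class.
    static class SolrResponse {
        ResponseBody response;
    }

    static class ResponseBody {
        int numFound;
        Doc[] docs;
    }

    static class Doc {
        String id;
        String title;
    }

    public static void main(String[] args) throws Exception {
        // URL-encode the query before building the GET request
        String query = URLEncoder.encode("title:\"solr rocks\"", "UTF-8");
        URL url = new URL("http://localhost:8983/solr/select?q=" + query + "&wt=json");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try {
            Reader reader = new InputStreamReader(conn.getInputStream(), "UTF-8");
            SolrResponse parsed = new Gson().fromJson(reader, SolrResponse.class);
            System.out.println("numFound: " + parsed.response.numFound);
        } finally {
            conn.disconnect();
        }
    }
}

Gson leaves a field at its default value (null for objects) when the key is absent, so a null check on the nested objects is worthwhile before use.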

Wednesday, February 6, 2013

Errors running builder 'Google WebApp Project Validator'

Error:

Problems occurred building the selected resources.
Errors running builder 'Java Builder' on project 'curve_app'.
com/google/gdt/eclipse/suite/preferences/GdtPreferences
Errors running builder 'Google WebApp Project Validator' on project 'curve_app'.
java.lang.NullPointerException

Workaround solution:

Close all projects, then close Eclipse.

Start Eclipse with the "eclipse -clean" option.

If it does not work the first time, try repeating the process 2-3 times.

I'm not sure of the exact reason why this error appears.


Tuesday, February 5, 2013

Highcharts Phantomjs Export - TypeError: 'undefined' is not an object

Generating a PNG file from a config-

phantomjs-1.8.1/bin/phantomjs highcharts-convert.js -infile config.json  -outfile out1.png -width 300 -scale 2.5 -constr Chart -callback callback.js

Error-

TypeError: 'undefined' is not an object

Things to check-
1. The following 3 files should be in the same folder, or you have to set their paths here-
var config = {
                /* define locations of mandatory javascript files */
                HIGHCHARTS: 'highstock.js',
                HIGHCHARTS_MORE: 'highcharts-more.js',
                JQUERY: 'jquery-1.8.2.min.js'
        },

2. The 'infile' parameter should have a .json extension, because that is what the script uses to decide between SVG input and a configuration JSON.

Monday, January 21, 2013

Hadoop - OutOfMemoryError: Java heap space

I was writing a Hadoop job that processes many files and creates multiple output files from each one. I was using "MultipleOutputs" to write them. It worked fine for a small number of files, but for a large number of files I was getting the following error. I tried increasing the ulimit and -Xmx, but to no avail.
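
For reference, the reduce side looked roughly like the sketch below (the key/value types and the per-key output path are assumptions, not the actual MPReducer code). Each distinct base path opens its own HDFS output stream, which is why heap usage grows with the number of output files-

import java.io.IOException;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

public class MPReducerSketch extends Reducer<Text, Text, NullWritable, Text> {

    private MultipleOutputs<NullWritable, Text> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<NullWritable, Text>(context);
    }

    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        for (Text value : values) {
            // Route each record to a file named after its key; every distinct
            // base path gets its own record writer and DFS output stream.
            mos.write(NullWritable.get(), value, key.toString() + "/part");
        }
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        mos.close();  // flush and close all open record writers
    }
}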

2013-01-15 13:44:05,154 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.hdfs.DFSOutputStream$Packet.<init>(DFSOutputStream.java:201)
    at org.apache.hadoop.hdfs.DFSOutputStream.writeChunk(DFSOutputStream.java:1423)
    at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunk(FSOutputSummer.java:161)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:136)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:125)
    at org.apache.hadoop.fs.FSOutputSummer.write1(FSOutputSummer.java:116)
    at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:90)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:54)
    at java.io.DataOutputStream.write(DataOutputStream.java:90)
    at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat$LineRecordWriter.writeObject(TextOutputFormat.java:78)
    at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat$LineRecordWriter.write(TextOutputFormat.java:99)
    at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:386)
    at com.demoapp.collector.MPReducer.reduce(MPReducer.java:298)
    at com.demoapp.collector.MPReducer.reduce(MPReducer.java:28)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:595)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:433)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)


Solution:
I used the following configuration values to resolve it-
OPTS="-Dmapred.reduce.tasks=8 -Dio.sort.mb=640 -Dmapred.task.timeout=1200000"
hadoop jar ${JAR} ${OPTS} -src ${SRC} -dest ${DST}
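
The -D options above are picked up by GenericOptionsParser when the job's main class runs through ToolRunner; equivalently, the same values can be set on the Configuration in the driver (a sketch, with the job wiring elided)-

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class JobDriverSketch extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();            // already holds any -D overrides
        conf.setInt("mapred.reduce.tasks", 8);     // or pin them explicitly here
        conf.setInt("io.sort.mb", 640);
        conf.setLong("mapred.task.timeout", 1200000L);
        // ... build and submit the Job with this conf ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new JobDriverSketch(), args));
    }
}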