Friday, November 9, 2012

java.lang.IllegalStateException while configuring Oracle NoSQL


“Big Data” is a well-known term in today's IT industry, and many people are working in this area. It inspired me to experiment with Big Data, so I started exploring Oracle NoSQL. Though it is fairly simple to install and use Oracle NoSQL, I faced many challenges. Here is one of them.

I came across the following exception while configuring the KV Store on my system.
Oracle NoSQL DB 11gR2.1.2.123
java.lang.IllegalStateException: unexpected exception creating environment
 at oracle.kv.impl.admin.Admin.openEnv(Admin.java:407)
 at oracle.kv.impl.admin.Admin.renewRepEnv(Admin.java:335)
 at oracle.kv.impl.admin.Admin.<init>(Admin.java:230)
 at oracle.kv.impl.admin.AdminService.configure(AdminService.java:252)
 at oracle.kv.impl.admin.CommandServiceImpl$33.execute(CommandServiceImpl.java:629)
 at oracle.kv.impl.fault.ProcessFaultHandler.execute(ProcessFaultHandler.java:142)
 at oracle.kv.impl.admin.CommandServiceImpl.configure(CommandServiceImpl.java:624)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:616)
 at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:322)
 at sun.rmi.transport.Transport$1.run(Transport.java:177)
 at java.security.AccessController.doPrivileged(Native Method)
 at sun.rmi.transport.Transport.serviceCall(Transport.java:173)
 at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:553)
 at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:808)
 at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:667)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
 at java.lang.Thread.run(Thread.java:679)
Caused by: java.lang.ExceptionInInitializerError
 at com.sleepycat.je.rep.elections.Elections.<init>(Elections.java:193)
 at com.sleepycat.je.rep.impl.node.RepNode.startup(RepNode.java:711)
 at com.sleepycat.je.rep.impl.node.RepNode.joinGroup(RepNode.java:1542)
 at com.sleepycat.je.rep.impl.RepImpl.joinGroup(RepImpl.java:459)
 at com.sleepycat.je.rep.ReplicatedEnvironment.joinGroup(ReplicatedEnvironment.java:414)
 at com.sleepycat.je.rep.ReplicatedEnvironment.<init>(ReplicatedEnvironment.java:467)
 at com.sleepycat.je.rep.ReplicatedEnvironment.<init>(ReplicatedEnvironment.java:332)
 at com.sleepycat.je.rep.ReplicatedEnvironment.<init>(ReplicatedEnvironment.java:396)
 at oracle.kv.impl.admin.Admin.openEnv(Admin.java:380)
 ... 20 more
Caused by: com.sleepycat.je.EnvironmentFailureException: (JE 5.0.36) java.net.UnknownHostException: macmini-opensuse: macmini-opensuse UNEXPECTED_EXCEPTION: Unexpected internal Exception, may have side effects.
 at com.sleepycat.je.EnvironmentFailureException.unexpectedException(EnvironmentFailureException.java:286)
 at com.sleepycat.je.rep.elections.TimebasedProposalGenerator.<clinit>(TimebasedProposalGenerator.java:107)
 ... 29 more
Caused by: java.net.UnknownHostException: macmini-opensuse: macmini-opensuse
 at java.net.InetAddress.getLocalHost(InetAddress.java:1426)
 at com.sleepycat.je.rep.elections.TimebasedProposalGenerator.<clinit>(TimebasedProposalGenerator.java:62)
 ... 29 more
The problem is caused by an incorrect IP-to-hostname binding. That binding lives in the /etc/hosts file. The following steps helped me solve the issue.

Get Host Name

Identify the host name of your system using the hostname command.
 $ hostname

Get IP address

Get the IP address of your system using the ifconfig command.
 $ ifconfig

Correct IP and host binding

Comment out the loopback entries in your /etc/hosts file (they start with '127.0.') and add a new line of the form 'ip-address hostname localhost':
#127.0.0.1 localhost
#127.0.1.1 localhost
192.168.50.50 my-desktop localhost
It solved my problem.
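As a quick sanity check (my own addition, using standard Linux tools), you can verify that the hostname now resolves before restarting the KV Store:

```shell
# Verify that this machine's hostname resolves to an IP address.
HOST=$(hostname)
if getent hosts "$HOST" > /dev/null; then
    echo "$HOST resolves to: $(getent hosts "$HOST" | awk '{print $1}')"
else
    echo "$HOST does not resolve; re-check /etc/hosts" >&2
fi
```

If the second branch fires even after editing /etc/hosts, double-check the new line for typos.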

Friday, June 15, 2012

Rake: wrong number of arguments (3 for 2)

Rake is a simple Ruby build program, similar to make. Many versions of Rake are available. If you are using Rake 0.9.x you might run into the error “wrong number of arguments (3 for 2) (ArgumentError)”, while the same piece of code runs without error on Rake 0.8.x.

Here are two methods to deal with this problem. The first is simple: remove the 0.9.x version and install 0.8.x; the latest release in that series is 0.8.7.

$ gem uninstall rake -v 0.9.x
$ gem install rake -v 0.8.7

The second method is to invoke the rake command with an explicit version number:
$ rake _0.8.7_

Monday, April 9, 2012

Hive ODBC Driver on Ubuntu

Apache Hive is used for managing large datasets residing in distributed storage. It provides an SQL-like query language (HiveQL) to interact with datasets, as well as a shell utility for running Hive queries. But if you need to use HiveQL from another application, some sort of driver is required.

The Hive ODBC Driver is a software library that implements the Open Database Connectivity (ODBC) API standard for the Hive database management system, enabling ODBC compliant applications to interact seamlessly (ideally) with Hive through a standard interface.

To learn how to install and use the driver, I went through https://cwiki.apache.org/Hive/hiveodbc.html. I followed the given steps using the latest versions of Hadoop, Hive and Thrift, but could not get the code to compile. After digging through many internet pages, I finally managed to compile it.
This article describes the steps I used to compile the code.

I tested the following steps on Ubuntu 10.10.

Install Apache Hadoop

Please refer to http://hadoop.apache.org/common/docs/r0.20.2/quickstart.html to download and install Apache Hadoop on Ubuntu.

Install Apache Hive

Please follow the instructions given at https://cwiki.apache.org/Hive/adminmanual-installation.html to download and install Apache Hive.

Install Thrift-0.5.0

1. Install required tools
$ sudo apt-get install libboost-dev libevent-dev python-dev automake pkg-config libtool flex bison
$ sudo apt-get install php5-dev
$ sudo apt-get install ant
$ sudo apt-get install openjdk-6-jdk
$ sudo apt-get install bjam
$ sudo apt-get install libboost-all-dev
2. Download Thrift-0.5.0
Get the source code for Thrift-0.5.0 from http://archive.apache.org/dist/incubator/thrift/0.5.0-incubating/thrift-0.5.0.tar.gz
3. Install Thrift
Make and install thrift after tar extraction with the following commands:
#Configure and build thrift compiler and libraries
$ cd thrift-0.5.0
$ ./configure --without-csharp --without-ruby --prefix=<thrift_install_path>
$ make
# Install thrift
$ make install
# Configure, build, and install fb303
$ cd contrib/fb303
$ ./bootstrap.sh
$ ./configure --with-thriftpath=<thrift_install_path> --prefix=<thrift_install_path>
$ make && make install

Note: If compilation fails with errors about undeclared standard functions, add the missing #include directives to the source files the compiler complains about.
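For example, older Thrift releases often fail on newer GCC with errors like “'strncpy' was not declared in this scope”; the fix is to prepend the missing header to the file named in the error. The sketch below demonstrates the edit on a scratch stand-in file (the real path comes from your build output):

```shell
# Demo on a stand-in file; replace "example.cpp" with the source file
# your compiler error actually names.
printf 'int main() { return 0; }\n' > example.cpp
sed -i '1i #include <cstring>' example.cpp   # prepend the missing header
head -n 1 example.cpp                        # → #include <cstring>
```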

Build and test Hive Client

Build the Hive client by running the following command from HIVE_HOME
$ ant compile-cpp -Dthrift.home=<THRIFT_HOME>

Execute the Hive client tests by running the following command from HIVE_HOME/odbc/
$ ant test -Dthrift.home=<THRIFT_HOME>

To install the Hive client libraries onto your machine, run the following command from HIVE_HOME/odbc/
$ sudo ant install -Dthrift.home=<THRIFT_HOME>

If you encounter any issues, please refer to the “Hive Client Build/Setup” section at https://cwiki.apache.org/Hive/hiveodbc.html

Build Unix ODBC Wrapper

In the unixODBC root directory, run the following command:
$ ./configure --enable-gui=no --prefix=<unixODBC_INSTALL_DIR>

Compile the unixODBC API wrapper with the following:
$ make

Run the following from the unixODBC root directory:
$ sudo make install

If you encounter any issues, please refer to the “unixODBC API Wrapper Build/Setup” section at https://cwiki.apache.org/Hive/hiveodbc.html

After compilation, the driver will be located at <UnixODBC_BUILD_DIR>/Drivers/hive/.libs/libodbchive.so.1.0.0.

You can manually install the unixODBC API wrapper by doing the following:
$ sudo cp <UnixODBC_BUILD_DIR>/Drivers/hive/.libs/libodbchive.so.1.0.0 <SYSTEM_INSTALL_DIR>
$ cd <SYSTEM_INSTALL_DIR>
$ sudo ln -s libodbchive.so.1.0.0 libodbchive.so
$ sudo ldconfig
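After running ldconfig, you can confirm the dynamic linker actually sees the driver. This verification step is my own addition and assumes the library is named libodbchive, as above:

```shell
# Check whether the dynamic linker cache knows about the Hive ODBC driver.
if ldconfig -p 2>/dev/null | grep -q libodbchive; then
    status="registered"
else
    status="missing"
fi
echo "libodbchive: $status"
```

If it reports "missing", re-run ldconfig as root after copying the library.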

Connecting Driver to the Driver Manager

1. Export LD_LIBRARY_PATH and LD_PRELOAD with proper libraries
LD_LIBRARY_PATH should contain the directories holding the library files libodbchive.so, libhiveclient.so and libthrift.so. Typically all of these are in /usr/local/lib/.
Use this command to export LD_LIBRARY_PATH:
$ export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH

After that run this command:
$ export LD_PRELOAD=/usr/local/lib/libodbchive.so
2. Connecting Driver to the Driver Manager

Connect the Driver to a Driver Manager as described in the “Connecting the Driver to a Driver Manager” section of https://cwiki.apache.org/Hive/hiveodbc.html.
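For reference, a DSN definition in odbc.ini generally looks like the sketch below. The section name "Hive", the driver path, and the HOST/PORT values are assumptions to adjust for your setup (10000 is the default HiveServer Thrift port):

```ini
[Hive]
Driver      = /usr/local/lib/libodbchive.so
Description = Hive ODBC Driver
HOST        = localhost
PORT        = 10000
```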
3. Testing with ISQL
You can test the driver interactively with isql using the following command:
$ isql -v Hive