Sun opened the final day of JavaOne with a general session called "Extreme Innovation". This was a showcase for novel, interesting, and out-of-this-world uses of Java-based technology.
VisualVM works with local or remote applications, using JMX over RMI to connect to remote apps. While you have to run VisualVM itself under JDK 1.6, it can connect to any JVM version from 1.4.2 through 1.7. Local apps are automatically detected and offered in the UI for debugging. VisualVM uses the Java Platform Debugger Architecture to show thread activities, memory usage, object counts, and call timing. It can also take snapshots of the application’s state for post-mortem or remote analysis.
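Under the hood, numbers like these come from the JVM's standard platform MBeans, which VisualVM reads over JMX. As a minimal sketch (plain `java.lang.management` APIs, nothing VisualVM-specific), the same thread and memory beans can be queried in-process:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.ThreadMXBean;

public class JmxPeek {
    public static void main(String[] args) {
        // The same platform MXBeans a JMX client like VisualVM reads
        // remotely, queried here in-process for illustration.
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();

        System.out.println("Live threads: " + threads.getThreadCount());
        System.out.println("Heap used (bytes): "
                + memory.getHeapMemoryUsage().getUsed());
    }
}
```

To let a remote VisualVM see these beans, the target JVM is typically started with the `com.sun.management.jmxremote.*` system properties (port, authentication, SSL).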
Memory problems can be a bear to diagnose. VisualVM includes a heap analyzer that can show reference chains. From the UI, it looks like it can also detect and indicate reference loops.
One interesting feature of VisualVM is the ability to add plug-ins for application-specific behavior. Sun demonstrated a GlassFish plug-in that adds custom metrics for request latency and volume, and the ability to examine each application context independently.
The application does not require any special instrumentation, so you can run VisualVM directly against a production application. According to Sun, it adds "almost no overhead" to the application being examined. I’d still be very cautious about that. VisualVM allows you to enable CPU and memory profiling in real time, so that will certainly have an effect on the application. Not to mention, it also lets you trigger a heap dump, which is always going to be costly.
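For a sense of what that heap-dump trigger does, HotSpot exposes it through a diagnostic MXBean you can also call yourself. A sketch, assuming a HotSpot JVM (the bean lives in `com.sun.management`, and the file name here is my own choice):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDump {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean diag =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // "true" means dump only live objects, which forces a full GC first.
        // That pause is one reason a heap dump is never free in production.
        String path = "visualvm-demo.hprof";
        diag.dumpHeap(path, true);
        System.out.println("Dumped heap to " + path);
    }
}
```

The resulting `.hprof` file is what heap analyzers (VisualVM's included) load for reference-chain analysis.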
VisualVM is available for download now.
Fluffy Stuff at the Edge
We got a couple of demos of Java in front of the end-user. One was a cell phone running an OpenGL scene at about 15 frames per second on an NVidia chipset. All the rendering was done in Java and displayed via OpenGL ES, with 3D positional audio. Not bad at all.
Project Darkstar got a few moments in the spotlight, too. They showed off a game called Call of the Kings, a multiplayer RTS that looked like it came from 1999. Call of the Kings uses the jMonkey Engine (built on top of JOAL, JOGL, and jInput) on the client and Project Darkstar’s game server on the backend. It’s OK, but as game engines go, I’m not sure how relevant it will be.
There was also a JavaCard demo, running Robocode players on JavaCards. That’s not just storing the program on the card; the code was actually executing on the card. Two finalists were brought up on stage (but not given microphones of their own, I noticed) for a final battle between their tanks. Yellow won, and received a PS3. Red lost, but got a PSP for making it to the finals.
Sentilla tried to get out from the "creepy" moniker by bouncing mesh-networked, location-tracking beachballs around the audience. Each one had a Sentilla "mote" in it, with a 3D accelerometer inside. Receivers at the perimeter of the hall could triangulate the beachballs’ locations by signal strength. For me, the most interesting thing here was James Gosling’s talk about powering the motes. They draw so little power that it’s possible to power them from ambient sources: vibration and heat. Interesting. Still creepy, but interesting.
At one point, the presenter wrote down a list, narrating as he went. For item one, he wrote the numeral "1" and the word "pulse". For item two, he wrote the numeral "2" and drew a little doodle of a desktop. Item three was the numeral and a vague cloudy thing. All this time, the pen was recording his audio, and associating bits of the audio stream with the page locations. So when he tapped the numeral "1" that he had written, the pen played back his audio. Not bad.
Then he put an "application card" on the table and tapped "Spanish" on it. He wrote down the word "one"… and the pen spoke the word "uno". He wrote "coffee please" and it said "cafe por favor". Then he had it do the same phrase in Mandarin and Arabic. Handwriting recognition, machine translation, and speech synthesis all in the pen. Wow.
Next, he selected a program from the pen’s menu. The special notebook has a menu crosshair on it, but you can draw your own crosshair and it works the same way: use the pen to tap the up-arrow on paper, and the menu changes on the display. He picked a piano program, and the pen started to give him directions on how to draw a piano. Once he was done drawing it, he could tap the "keys" on paper to play notes.
The pen captures x, y, and t information as you write, so it’s digitizing the trajectory rather than the image. This is great for data compression when you’re sharing pages across the Livescribe web site. It’s probably also great for forgers, so there might be a concern there.
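The pen's actual on-device format isn't public, but the compression win from storing trajectories is easy to sketch: consecutive (x, y, t) samples differ only slightly, so delta-encoding them leaves mostly tiny numbers that compress well. A purely hypothetical illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class StrokeDelta {
    // One pen sample: position plus timestamp. Illustrative only;
    // the real Livescribe data format is not public.
    record Sample(int x, int y, int t) {}

    // Store the first sample absolutely, then only the (usually
    // tiny) differences between consecutive samples.
    static List<Sample> encode(List<Sample> stroke) {
        List<Sample> out = new ArrayList<>();
        Sample prev = null;
        for (Sample s : stroke) {
            if (prev == null) {
                out.add(s);
            } else {
                out.add(new Sample(s.x() - prev.x(), s.y() - prev.y(), s.t() - prev.t()));
            }
            prev = s;
        }
        return out;
    }

    // Reverse the encoding by accumulating the deltas.
    static List<Sample> decode(List<Sample> deltas) {
        List<Sample> out = new ArrayList<>();
        Sample prev = null;
        for (Sample d : deltas) {
            Sample s = (prev == null) ? d
                    : new Sample(prev.x() + d.x(), prev.y() + d.y(), prev.t() + d.t());
            out.add(s);
            prev = s;
        }
        return out;
    }
}
```

The round trip is lossless, and the small deltas are what a downstream entropy coder would squeeze.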
Emphasizing real-time Java for a bit, Sun showed off "Blue Wonder", an industrial controller built out of an x86 computer running Solaris 10 and Java RTS 2.0. This is suitable for factory control applications and is, apparently, very exciting to factory control people.
From the DARPA Urban Challenge event, we saw "Tommy Jr.", an autonomous vehicle. It followed Paul Perrone into the room, narrating each move it was making. Fortunately, nobody tried to demonstrate its crowd control or law enforcement features. Instead, they showed off an array of high-resolution sensors and actuators. It’s all controlled, under very tight real-time constraints, by a single x86 board running Solaris and Java RTS.
Into New Realms
Next, we saw a demo of JMars. This impressive application helps scientists make sense out of the 150 terabytes of data we’ve collected from various Mars probes. It combines data and imaging layers from many different probes. One example overlaid hematite concentrations on top of an infrared image layer. It also knows enough about the various satellites’ orbits to help plan imaging requests.
Ultimately, JMars was built to help target landing sites for both scientific interest and technical viability. We’ll soon see how well they did: the Phoenix lander arrives in about two weeks, targeting a site that was selected using JMars.
JMars is both free to use and open source. Dr. Phil Christensen from Arizona State University invited the Java community to explore Mars for themselves, and perhaps join the project team.
Thousands of people, physicists and otherwise, are eagerly awaiting the LHC’s activation. We got a peek behind the scenes at how Java is being used within CERN.
On the one hand, some very un-sexy business process work is being done. LHC is a vast project, so it’s got people, budget, and materials to manage. Ho hum. It’s not easy to manage all those business processes, but it sure doesn’t demo well.
On the other hand, showing off the grid computing infrastructure does.
Once it’s operating, the ATLAS detectors alone will produce a gigabyte an hour of image data. All of it needs to be processed. "Processing" here means running through some amazing pattern recognition programs to analyze events, looking for anomalies. There will be far too many collisions generated every day for a physicist to look at all of them, so automated techniques have to weed out "uninteresting" collisions and call attention to ones that don’t fit the profile.
CERN estimates that 100,000 CPUs will be needed to process the data. They’ve built a coalition of facilities into a multi-tier grid. Even today, they’re running 16,000 jobs on the grid across hundreds of data centers. With that many nodes involved, they need some good management and visualization tools, and we got to see one. It’s a 3D world model with iconified data centers showing their status and capacity. Jobs fly from one to another along geodesic links. Very cool stuff.
Java is a mature technology that’s being used in many spheres other than application server programming. For me, and many other JavaOne attendees, this session really underscored the fact that none of our own projects are anywhere near as cool as these demos. I’m left with the desire to go build something cool, which was probably the point.