9

I'm not talking about the recent Java/Oracle debacle here; this is something I've been wondering about recently.

When I first started programming, Java was the big thing client-side. These days, however, you hardly ever see a Java applet, whereas Flash and JavaScript are ubiquitous.

Where did Java go wrong? Did Sun decide to concentrate on the server side and let the client side stagnate? How did Flash become dominant over Java? Flashy (no pun intended) graphics? Marketing?

Never having been a Java developer, I haven't followed this closely, so I'm interested to hear your thoughts.

Groky
  • 217

5 Answers

17
  • Firewalls would sometimes block Java applets, so developers couldn't be sure whether they would work.
  • They required a browser plug-in, which led many developers to favour JavaScript over Java applets so that users didn't have to download and install anything. The Macromedia Flash plug-in had a simpler, more seamless, and less noisy way of downloading, installing and updating itself.
  • The API wasn't so standardized then, so applets could fail to work if the user didn't have the right version of Java installed.
  • They loaded slowly, used too much memory, and would often crash the browser.
  • Java applets died before Swing came out, so GUIs were stuck with raw AWT and were difficult and ugly (see the sketch after this list).
  • IE didn't support applets fully, so compatibility was difficult to sort out.
  • Flash and JavaScript are much easier for non-programmers to use.
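
For context, here's a minimal sketch of what a pre-Swing applet looked like: raw AWT widgets, delivered to the browser plug-in via an applet tag in the page. The class name is made up for illustration.

    import java.applet.Applet;
    import java.awt.Button;
    import java.awt.Graphics;

    // Embedded in a page with something like:
    //   <applet code="HelloApplet.class" width="300" height="100"></applet>
    public class HelloApplet extends Applet {
        @Override
        public void init() {
            // Pre-Swing, AWT's heavyweight native-peer widgets were all you had.
            add(new Button("Click me"));
        }

        @Override
        public void paint(Graphics g) {
            g.drawString("Hello from an applet", 20, 60);
        }
    }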
sam i am
  • 218
7

I believe streaming video was the "killer app" for Flash. Although video had been tried before in Java applets, the frame rate wasn't very high and it required users to install the relatively heavyweight JRE.

Along came Flash with its small install size and (eventually) high video frame rate. It helped that browser vendors started including Flash as part of the default browser installation.

Java is still hindered by a large installation size and slow start times compared to Flash.

Barry Brown
  • 4,095
5

Besides what everybody else has pointed out, I'd like to note the difference in who develops for each platform: Java is more appreciated by serious coders and is seen a lot in universities, while Flash is aimed at web developers (with programmers as a secondary audience for the platform).

So you see Flash doing things it shouldn't (like whole websites) in the hands of designers, while Java isn't in the hands of designers in the first place.

0

Applets and Flash are different technologies. When first introduced, applets had a lot of security holes: untrusted code from the web was effectively running on the client side, and sandbox escapes let it do things no customer wants (the sketch below shows what the sandbox is supposed to prevent). The technology was later improved substantially. Flash, on the other hand, was more lightweight than applets were, and people accepted it more readily because they didn't consider it a security threat; apart from the minor vulnerabilities that do occur in Flash plug-ins, Flash was much safer. The initial security setback hurt applets dearly. You're also probably right that Sun didn't pursue applets that aggressively.
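
To make the sandbox point concrete, here's a minimal sketch: an unsigned applet runs under a SecurityManager, so any attempt to touch the local machine, reading a file for instance, is rejected before it happens. The class name and file path are hypothetical.

    import java.applet.Applet;
    import java.io.FileNotFoundException;
    import java.io.FileReader;

    // Hypothetical sketch: under the classic applet sandbox, an unsigned
    // applet's attempt to read a local file is vetoed by the SecurityManager.
    public class SandboxDemo extends Applet {
        @Override
        public void init() {
            try {
                new FileReader("/etc/passwd"); // forbidden for unsigned applets
            } catch (SecurityException e) {
                // The sandbox rejects the access before the file is opened.
                showStatus("Blocked by the sandbox: " + e.getMessage());
            } catch (FileNotFoundException e) {
                // Only reachable when the code is trusted (e.g. a signed
                // applet) and the file genuinely doesn't exist.
            }
        }
    }

The security holes mentioned above were precisely escapes from this model, ways for untrusted code to get past the SecurityManager's checks.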

Something similar to applets, and probably more useful, is the Native Client that Google is building for Chrome. Let's see how it goes.

Geek
  • 3,961
-1

The whole idea of a cross-platform UI is flawed. Windows uses very different layouts, mechanisms, key presses, and so on from Mac OS X. To make a seamless experience that doesn't irritate the user, you have to get your app exactly right for each platform, taking advantage of its specifics.

Any cross-platform solution will always suffer from the "least common denominator": only the very basic things the different UIs have in common can be used, and that makes for a bad user experience all around. This is the same argument all-mighty Steve uses against Flash on the iPhone.

I think the "irritating the user" issue is a very fundamental "soft" problem that people from a tech perspective often miss.