
For the last couple of years, all of the serious projects I have worked on have been either web based or have had a non-graphical user interface (services, command line scripts, etc.). I can throw together a WinForms app or do some simple WPF when needed, but I've never really delved into lower-level APIs like MFC or Qt.

I understand that this depends on the situation, but in general, is it still worth taking the time to learn desktop development well, or are applications moving to the web and mobile devices at a pace that makes this knowledge less relevant? Also, do you expect developers you work with to have desktop GUI expertise?

Peter Boughton
aubreyrhodes

7 Answers


I'd say yes, it is. There's sort of a pendulum effect in program development. First everything ran directly on the computer. Then when the computer became powerful enough to run multiple programs, they got mainframes with dumb terminals. But dumb terminals really suck in terms of usability, so as soon as computers got powerful enough to put reasonable amounts of hardware inside a terminal-sized system, we got personal computers, and everything ran directly on the computer.

Then they invented the World Wide Web, and we're back to a mainframe (server) and a dumb terminal (browser). But dumb terminals still really suck in terms of usability, and people are starting to relearn the lessons of 30 years ago, so we're trending away from that again. A lot of the really hot development these days is for desktop (or mobile) apps that run locally, but are able to connect to the Internet for specific purposes to enhance their functionality.

Mason Wheeler

Even if you never intend to do desktop development, I would suggest you get enough experience to form an informed opinion on when a desktop solution is a better fit than a web client.

Bill

Yes, but not in the way you are thinking.

GUI programming is not especially difficult, nor does it require specialized skills beyond familiarity with the GUI toolkit's API. Hooking up buttons, windows, and controls isn't terribly hard, and modern programming environments make it far easier than the early days of frameworks like MFC. It's the kind of thing you can pick up quickly when it's demanded.

However, while hooking up buttons and text boxes is fairly easy, knowing when and where to place those buttons, and designing a GUI that actual human beings can use, is very difficult. That is a very valuable and important skill to have. Fortunately, the design principles that apply to native interfaces and to the web are very similar.

So learn how to design good user interfaces that are effective and don't confuse users, and familiarity with the programming side will come almost for free.

whatsisname

It's really going to depend on your situation. I recently worked for a Fortune 500 company that had several projects to convert web applications back into desktop applications (SmartClient/ClickOnce). In their particular circumstances it made a lot of sense and eliminated several usability issues their existing apps suffered from.

If you're a full-time employee and your company doesn't generally build desktop apps, then it probably doesn't make sense to be fully up to speed on WinForms or WPF. If, however, you're a consultant and you'd like to be able to offer another service to your clients, then it can't possibly hurt.

Walter

Hmm, besides GMail, Stack Exchange, and my bank's home banking, I use non-web software all day. Now, with the advent of smartphones and tablets, web applications are even less attractive to me (I use my smartphone's Facebook client, for instance). That's the user side.

Developer side: in the last 10 years I have worked almost exclusively on non-web software (and my career has spanned many very different domains, as I worked as a software consultant), and I don't see my work trending toward the web.

So yes, learning desktop GUI environments is still a must.

Wizard79

Of course "it depends" -- but I think your experience is typical. I have rarely had to create a thick client for any of the applications I have written. Unless there is a specific reason the client needs to run on the desktop (connectivity issues, a 3D game, etc.), I believe it's easier for the developers and admins to maintain one "instance" of the application. If someone has the skill set to design a web application, they should generally be OK moving into the desktop app realm.

Actually, I think it's more important that a thick-client developer learn web application programming -- the inherent statelessness of HTTP makes it a more difficult development paradigm to wrap your head around (or at least you have to do a little more thinking than just slapping controls on a panel).
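To make the contrast concrete, here's a minimal sketch (hypothetical names, not a real web framework): in a desktop app, state can simply live in an object between events, while an HTTP-style handler gets no memory of earlier requests and must rehydrate and persist its state on every call.

```python
class DesktopCounter:
    """Desktop-style: state lives in the object between button clicks."""
    def __init__(self):
        self.count = 0

    def on_click(self):
        self.count += 1
        return self.count


# Web-style: HTTP gives a handler no memory of earlier requests, so
# state must be loaded from somewhere external (cookie, database,
# cache) at the start of each request and saved again before responding.
SESSIONS = {}  # stands in for an external session store

def handle_click(session_id):
    state = SESSIONS.get(session_id, {"count": 0})  # rehydrate state
    state["count"] += 1
    SESSIONS[session_id] = state                    # persist it again
    return state["count"]
```

The desktop version "just works" because the process outlives the interaction; the web version has to do the extra bookkeeping explicitly, which is the kind of thinking the statelessness of HTTP forces on you.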

Don't forget -- there are also technologies like Silverlight and Adobe Flex/AIR that straddle the line between desktop and web applications.

Watson

According to the IE9 team:

There shouldn’t be a gap between native and web apps. HW acceleration, fast JS and site pinning starts it off

I think it's a safe bet that these technologies will grow closer together. If you're a Java developer, there's very little difference between developing desktop apps and web apps (using GWT). It's not unreasonable to expect more and more "desktop" development platforms to be able to target the browser engine. It's also not unreasonable to expect more and more desktop apps to adopt a web-like distribution model (auto-updating in the background, sandboxed execution, like Chrome).