chiph (Programmer) (OP)
15 Feb 05 13:18
The article says that processor power has topped out, and that for any future performance gains, programmers must learn how to write concurrent software to take advantage of multi-core CPUs and hyperthreading.

http://www.gotw.ca/publications/concurrency-ddj.htm
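The shift the article describes can be sketched in a few lines: the same CPU-bound job done serially, then split across cores. This is a minimal illustration, not code from the article; the function names and the prime-counting workload are mine.

```python
# Minimal sketch of "the free lunch is over": a CPU-bound task gains nothing
# from extra cores unless the programmer splits the work up explicitly.
# All names here are illustrative.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def serial(limit):
    # One thread of execution: faster only if the clock speed goes up.
    return count_primes((0, limit))

def parallel(limit, workers=4):
    # Split the range into one chunk per worker; each chunk can run on its
    # own core, which is where future speedups have to come from.
    step = limit // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else limit)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    assert serial(10_000) == parallel(10_000)  # same answer either way
```

The point of the article is that the second version is harder to get right (partitioning, shared state, synchronization), and that programmers will no longer be able to avoid writing it.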

Chip H.

____________________________________________________________________
Click here to learn Ways to help with Tsunami Relief
If you want to get the best response to a question, please read FAQ222-2244 first

kHz (MIS)
15 Feb 05 14:57
There was an article a week ago about HP Research developing technology that replaces transistors, and in doing so creates almost limitless power.  I don't recall where I read the article, but it was interesting.  It also mentioned, however, that it would be years before it would be used as a primary method (if ever, I would add).

And to think this R&D occurred on Carly's watch and they turn around and oust her as CEO.  How fair is that?
kHz (MIS)
15 Feb 05 14:59
Here is a link to an article on the technology from HP Research:

http://www.xbitlabs.com/news/other/display/20050202040129.html
dilettante (MIS)
15 Feb 05 17:28

Quote:

The article says that processor power has topped out, and that for any future performance gains, programmers must learn how to write concurrent software to take advantage of multi-core CPUs and hyperthreading.
Or maybe cleaner code that isn't so power-hungry.  Then again with JVMs and "Frameworks" to contend with...
chiph (Programmer) (OP)
15 Feb 05 21:57

Quote:

Or maybe cleaner code that isn't so power-hungry.

I agree, but the current drive to get code written in the cheapest place in the world can only result in bad code being written.  Saw a good one at work the other day:

SELECT a, b, c
FROM tbl_x
WHERE a = a

I can only imagine that template-driven code will become the norm, and we all know how efficient that stuff is.  :-(

The only thing that might replace a careful design done by someone who knows the system inside & out, would be a design done by genetic algorithm, and then there's a serious trust issue -- how do you know that the GA's code isn't working because of some unintended side-effect?

Chip H.


dilettante (MIS)
16 Feb 05 8:35
Reminds me of those bloated, finned boats we drove in the 1950s and '60s.  The engineering placed no emphasis on clean and efficient energy use (or, for that matter, safety).  It took a series of political embarrassments before anything was done at all (Nader, the Saudis exerting the muscles car-culture had given them, etc.).

Anyone with any foresight can see a crisis looming in computing too.  Wasteful use of a cheap resource available in bulk from outside sources is a recipe for power plays that change the rules of the game.
chiph (Programmer) (OP)
16 Feb 05 14:04
Larry Osterman at Microsoft has started talking about concurrent programming in his blog:

http://blogs.msdn.com/larryosterman/archive/2005/02/14/372508.aspx

Chip H.


pmonett (Programmer)
21 Feb 05 7:42

Quote:

Wasteful use of a cheap resource available in bulk

I think that such waste has been happening since the 286 was released. Every generation of PC hardware (the 286, the 386, the 486, etc.) has seen important increases in computing power, available memory and storage.
And what have we done with it? Apart from recompiling the same old code from time to time, what have we really done with it? Clippy? Streaming video? Oohh, shiny!

Quote:

get code to be written in the cheapest place in the world, it can only result in bad code being written
If I'm not mistaken, buffer overflows are our daily nemesis, and have been since the beginning of the Internet. These buffer overflows were not implemented by low-pay Indian hacks; they were designed by highly-paid Western programmers, many of them with degrees in engineering or science. Fat lot of good that did.
I don't know anything about education in India (or any other third-world country, for that matter), and I am not aware that any Indian-based company has produced any code worthy of recognition yet, but given all the trouble we have with poorly-designed mail clients and OSes now, I fail to see how low-pay programmers can do any worse.

Never mind, it's Monday and I'm probably grumpy.

Pascal.
dilettante (MIS)
21 Feb 05 9:07
I think "bad" in the sense being discussed here was meant as wasteful rather than outright buggy.  The more I think about it though, the real waste is probably in the rapid replacement of hardware just to get the latest and greatest.

Most desktop systems sit running but idle for a lot of hours each day.  So maybe the waste is really in not using those resources to any good purpose.  In that sense, maybe using otherwise bloated development environments that result in bloated applications makes some use of the hardware at least.  If there is any gain by doing so (development productivity?) then it may be all to the good, since those overkill machines are just sitting around anyway.

I have to wonder if a typical business desktop really ever needs the resources of anything greater than around 256MB and 500MHz.  Machines of that power scale ought to be darned cheap and about the size of a really thick paperback book if built using today's technology, with substantially reduced power consumption compared with older ones.  Most of the size would be the hard drive and CD/DVD writer, assuming a small external power supply.

At the worst I can't imagine why they'd need to be larger than a case for a CD duplicator (one reader/one writer cases).
RiverGuy (Programmer)
21 Feb 05 9:58
Is code inefficiency really escalating in proportion to the advance in PC performance?  I think one major reason the "computer experience" hasn't netted users a huge speed increase is that there are simply more apps being run concurrently today than before. There are more apps running in the background, more multitasking, etc.  As new software ideas come out, more programs appear that are attractive to users.

russellbcopeland (Programmer)
21 Feb 05 12:50
I agree with dilettante in the sense that not using system resources is not always efficient. Sure, if you can get an app to need only 10MB of memory instead of 30MB that sounds great. But is it really? We programmer types often fail to acknowledge the real world implications of our drive to make code smaller and faster. Sometimes bigger and slower can make a lot of business sense.

For example, let's examine creating a business process that needs to run nightly to update data.

Let's say we can spend 2 weeks building it and optimizing it to run super efficiently and have it take 15 minutes to run on a $5000 box.

Or we can spend 1 week building it with a RAD tool that takes 1 hour to run on the same $5000 box.

What did we gain by spending the extra week? A four-fold improvement in efficiency! Sounds great! But if the business needs are such that we have a window of 3 hours to complete the task each night, we really have gained nothing at all! We pat ourselves on the back for being efficient, meanwhile the developer with the bloated code is twice as productive and ultimately more use to the business world.

Sure, this does not apply to all cases. Sometimes it is worth the extra time and effort to maximize efficiency. It is imperative to be able to do so. It is also imperative from an economic standpoint to know when to just take the shortcut and let the app be technically inefficient.

I would definitely argue that there are certain classes of applications where efficiency is extremely important. This would be any operating system or program meant to run continuously in the background. With these, one should always take the smallest possible footprint to leave the resources available for the apps that actually do the productive work.
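The trade-off above can be put in rough numbers. This is only a back-of-envelope sketch of the scenario in the post; the figures and names are illustrative.

```python
# Back-of-envelope for the nightly-job scenario (all figures illustrative):
# does the optimized version buy anything the business can actually see?

def fits_window(runtime_minutes, window_minutes=180):
    """The real business constraint: finish inside the 3-hour nightly window."""
    return runtime_minutes <= window_minutes

optimized = {"dev_weeks": 2, "runtime_min": 15}   # hand-tuned version
rad       = {"dev_weeks": 1, "runtime_min": 60}   # RAD-tool version

# Both versions meet the window...
assert fits_window(optimized["runtime_min"]) and fits_window(rad["runtime_min"])

# ...so the fourfold speedup is invisible to the business,
# while the RAD version ships a week earlier.
speedup = rad["runtime_min"] / optimized["runtime_min"]
dev_weeks_saved = optimized["dev_weeks"] - rad["dev_weeks"]
```

The constraint check is the whole argument: once both runtimes fit the window, extra efficiency stops being a business benefit and the development time saved dominates.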

chiph (Programmer) (OP)
21 Feb 05 15:31

Quote:

I think "bad" in the sense being discussed here was meant as wasteful rather than outright buggy.

Correct.  I meant it in the sense of lots of cut-n-paste code, where the programmer doesn't have a sense of what it does, but just finds something that sort of works, and futzes with it until it works adequately.

And, this isn't limited to outsourced code.  There are plenty of western programmers who do it too.  But my experience has been that it primarily comes from overseas programmers.

Chip H.


pmonett (Programmer)
22 Feb 05 4:22

Quote:

What did we gain by spending the extra week?

Let me see: 2 weeks build time for 15 minutes run time, or one week build time for one hour run time.

What is gained by spending another week optimizing?
More time for the backups that run after/before the process.
More flexibility for the administrator, secure in the knowledge that an important business process can be rescheduled without severely impacting the nightly run schedule.
The code taking one hour to complete will be judged obsolete sooner because of time constraints, whereas the 15-minute code will be able to justify its usefulness longer, since it will allow more tasks to run in the same night.

The price of the box is irrelevant; there has to be a box anyway. The cost of the coder(s) is relevant, but that just pushes back the date at which the code can be deemed to provide value for money (however the company decides to calculate that date).

I agree that quick & dirty is sometimes an acceptable way of doing things. Unfortunately, quick & dirty is what gave us buffer overflows in the first place. And cut&paste code is what has kept them alive.

My opinion is that it always pays to carefully plan an application, and design it as best as is possible - even if it is not destined to support a critical business process.

Pascal.
chelseatech (Instructor)
23 Feb 05 20:56
When you say you have seen "nothing from India..." you are closer than you realise.  The concept of zero was invented by Indian mathematicians while the Scots and English were running around covered in blue paint hiding from the Roman invaders.  India has a richer and longer mathematical heritage than most other civilisations.

It is always easy to find faults in foreigners and ignore the worse ones at home.

But hey, what can I say?  I'm from New Zealand, where high technology is the electric fence.

Editor and Publisher of Crystal Clear
www.chelseatech.co.nz/pubs.htm

chiph (Programmer) (OP)
24 Feb 05 11:03

Quote:

I'm from New Zealand were high technology is the electric fence.

You're being modest -- Weta Digital has what must be the largest compute cluster in the southern hemisphere.

You're right -- we wouldn't have gotten very far without the concept of zero.  It's one of those things that everyone just knew, but didn't apply it to counting until someone in India thought of it.

Quote:

It is always easy to find faults in foreigners and ignore the worse ones at home.

I've seen lots of bad code written by US programmers, too.  I'm not denying it.

What I suspect is happening is that the Indian software industry is about where we were in 1998 -- they're hiring everyone who can spell "object" and can fog a mirror.  As a consequence, the quality of code is rather low.

Chip H.


lionelhill (TechnicalUser)
24 Feb 05 11:11
Coming from a thoroughly western education, I remember being told at university not to re-write code that I could already take from somewhere else (in those days Fortran, which I'd have had to type in again). In fact this was the very thinking that later grew into black-box object-ism, and thence to rapid application development tools.

How can we criticise people for cut-'n'-pasting, and not bothering to understand what the black boxes really do, if we told them to do it that way in the first place?
chiph (Programmer) (OP)
25 Feb 05 12:13
AMD has just demonstrated their dual-core CPUs, and will be shipping this year.

http://www.amd.com/us-en/0,,3715_11787,00.html?redir=CPPA64

Chip H.


GwydionM (Programmer)
26 Feb 05 10:45
Every new computer has been an improvement, from my viewpoint.  I can do the same things faster and also a few extra bits.

------------------------------
An old man who lives in the UK

BocaBurger (Vendor)
8 Mar 05 10:12
Computers are not the issue. It is the software that allows them to do something, or to crash faster. The new dual-core CPUs will now allow multi-threaded applications to crash two threads, independently.

BocaBurger
<===========================||////////////////|0
The pen is mightier than the sword, but the sword hurts more!

jsteph (TechnicalUser)
9 Mar 05 8:08
Several years ago there was a thread about increasing CPU power for PCs, and I feel the same now as I did then -- it's a big yawn.

Sure, for a server, you want the power.  But my guess is that 95% of business desktops are overpowered.  You just don't need this kind of power to run Word, Excel, IE, and most of the commercial business apps out there.  Memory--yes--most desktops might benefit from more memory due to the bloat and inefficiencies mentioned.  But more power wouldn't be noticed by most business users.  

Gamers, yeah.  Graphics, yeah.  But not business desktops.

Me, I want internet bandwidth and memory.
--Jim
lionelhill (TechnicalUser)
9 Mar 05 8:35
jsteph, I agree entirely. But look at it another way: Virtually universally, the more senior a manager is, the larger his chair. I've never seen any evidence that managers develop larger or more delicate bottoms as they get promoted. In the same way, they always have bigger desks, even though it's their secretary who probably needs the bigger desk.

There are not a lot of senior managers who will tolerate having an older, slower cpu than their staff. I'm sure that's one reason why desktops get so vastly powerful.
Sympology (MIS)
9 Mar 05 12:33
Let's go back a few years. One of my last non-PC boxes was an Atari Falcon, 16MB with a 1GB drive (trust me, that was huge, and the 16MB cost me close on £200) and an accelerator card (32MHz, I think).
On this I ran Steinberg's Cubase Audio, a word processor and a decent imaging program.
Many a time we put it up against my friend's state-of-the-art P90 (clocked to 110MHz) with 128MB and a 2GB drive running 95.

In nearly every case the Atari trounced the PC in rendering, opening, saving and converting various files. As for audio, well, the Atari was in a league of its own.

Now, bearing in mind the processing and memory differences, why did the Atari win? In my opinion it's simple: as the Atari had so little processor power and memory (a standard Falcon was 4MB and 16MHz), the programmers had no choice but to write good quality code.
It was the same with games. Poor graphics and poor sound meant one thing: to survive you needed gameplay. Now if you have a poor game, you stick in lots of pretty sound and graphics and hope no one notices.
I'm sure the writers of the game MDK stated they developed on low-spec, poor-quality machines. If the game slowed down, crashed or generally was of poor quality, they went back and rewrote the code until it worked. Now that was quality programming.

Rant over.....
Stu..

Only the truly stupid believe they know everything.
Stu.. 2004

pmonett (Programmer)
10 Mar 05 2:24
Talking about quality programming, anyone ever play Chuck Yeager's Air Combat?
It was a flight sim from 1991. It played very well at the time, and guess what? IT STILL DOES!
That's right, a game from 1991 that ran on a 386 at 40MHz now runs on a machine that is 20000+ times faster, and everything still works just like it should.
The guys that coded that game were so good that they managed to make their game react to totally unforeseen hardware modifications, and do so gracefully. In 1991, nobody even dreamed of multi-gigahertz processors or DDR memory, and yet they managed to make their circa-1991 code operate flawlessly on circa-2005 hardware.
Is that quality programming, or what?

Pascal.
BocaBurger (Vendor)
10 Mar 05 7:45
Darn, you had a 40MHz 386? Mine was only 16MHz. DX or SX?

A programmer named Pascal, how did that happen?


pmonett (Programmer)
10 Mar 05 8:50
DX obviously, an AMD version (one that didn't totally break the floating point unit). It was fun.

As for the name, ask my parents

Pascal.
lionelhill (TechnicalUser)
10 Mar 05 11:01
You know, I can almost guess why. If you had EGA or VGA graphics, you drew on the real screen; therefore to avoid drawing catastrophes (noise, messy images) you had to coordinate with the raster beam, which meant the program speed was determined largely by the screen refresh rate - which is governed by what the human eye will put up with - which is still the same!

(Of course they probably still did a bit of speed-checking. If your machine is so slow that it takes more than a screen refresh to do all the calculations, then the program will suddenly jump in speed as processor speed improves. And I'm certainly not denying that it was quality programming. It takes a bit of skill to work round a raster beam and get good performance out of a minimal EGA/VGA)
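The timing style lionelhill describes -- the simulation advancing once per display refresh, so the game runs at the same speed on any processor -- can be sketched in a few lines. This is an illustration of the idea only, not how the 1991 title actually did it (that synced to the VGA raster beam directly); the names and constants are mine.

```python
# Sketch of a refresh-locked game loop: the world advances one step per
# "screen refresh", so a faster CPU just waits longer each frame instead
# of running the game faster.
import time

REFRESH_HZ = 60            # what the human eye will put up with
FRAME = 1.0 / REFRESH_HZ   # seconds per frame

def run(frames):
    pos = 0.0
    start = time.monotonic()
    for i in range(frames):
        pos += 1.0                      # one simulation step per frame,
                                        # not one per CPU cycle
        # Sleep until the next refresh, however quickly the CPU got here.
        next_frame = start + (i + 1) * FRAME
        time.sleep(max(0.0, next_frame - time.monotonic()))
    return pos

# 60 frames always advance the world by 60 units, on any machine.
```

A CPU-speed-dependent loop (step per iteration, no sleep) is the version that breaks on faster hardware; tying the step to the refresh rate is what lets circa-1991 code behave itself on circa-2005 machines.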
CapsuleCorpJX (IS/IT--Management)
8 Apr 05 13:23
Nano-tech might improve processor power.
And hopefully some genius will invent a new computer model different from the Turing machine (all computers are Turing machines).
chiph (Programmer) (OP)
8 Apr 05 13:42
FYI -
AMD is thinking of selling the dual-core Opterons at a discount, making them comparable in price with the single-core processors.  They'll likely go up after the introductory period.

Chip H.


jrbarnett (Programmer)
9 Apr 05 6:29
I've just had this link in a newsletter; it is certainly relevant to this thread: http://www.pcw.co.uk/news/1162350

What this means, of course, is that operating systems that only support single-CPU systems (e.g. XP Home) will have to be rewritten or replaced with an SMP-aware OS to take advantage of the dual-core operation.

John
dilettante (MIS)
9 Apr 05 9:33
I didn't see a mention of that in the article, but it stands to reason.  I wouldn't be surprised to see some XP Home SMP Edition arrive on the scene, though surely a Longhorn Home would have this ability from the start.  Worst comes to worst, you just use XP Pro.

Relatively few people bother with a full motherboard/CPU upgrade anymore, and would buy a new machine.  Since the existing machine with XP Home probably has an OEM license (not transferable to a new machine) there is no impact there.

Those who DO major upgrades to use such a dual-core processor chip would probably be faced with an upgrade to Pro.  Wouldn't you think most people inclined to do this are already running Pro though?

So I guess I agree with you but I don't see a major impact.  White box OEMs like those in Microsoft's System Builder program would just use another OEM license to keep the cost relatively low.
jrbarnett (Programmer)
9 Apr 05 9:53
In fact, re-reading the article above, it's not even that simple. From that article:

Quote (article):

In effect you get two Prescott P4 CPUs each with Hyper Threading (HT) in a single package, which can help boost PC performance when running suitable multithreaded applications - to Windows, the 840 appears as four virtual CPUs.

XP Pro and 2K Pro only support 2 CPUs. For more than that you need a server OS.
With current systems, this means Windows 2000 Server or Windows 2003 Server on the desktop PC just to take full advantage of the new CPUs.
This will of course need to change in the future to accommodate the new generation of CPUs.
The alternative is to disable Hyperthreading features to avoid the need for expensive server operating system licenses.

John
beanbrain (Programmer)
15 Apr 05 18:38
jrbarnett:

Quote:


What this means, of course, is that operating systems that only support single CPU systems (eg XP Home) will have to be rewritten or replaced with an SMP aware OS to take advantage of the dual core operation.

Unless some clever programmer somewhere comes up with code that sits on top of the OS and dispatches instructions to the different CPUs.
chiph (Programmer) (OP)
15 Apr 05 21:53
I think people running XP Home will be out of luck.  AFAIK, it can't even take advantage of hyperthreading.

From what I've read, Microsoft is trying to decide what to do with their licensing.  They make a lot of money when companies have to step up to Windows Advanced Server because they need to go to 4 CPUs, and I can't see them giving that up.  Don't forget that the companies also need to buy SQL Server Advanced edition to run on that 4-cpu box.  The regular SQL Server will only recognize 2 cpus.

Chip H.


Stevehewitt (IS/IT--Management)
21 Apr 05 16:16
http://www.theregister.co.uk/2005/04/21/amd_dualcore_opteron/

Third paragraph from the bottom. Licensing-wise, MS are planning on treating each physical processor as one processor -- the number of threads is not an issue.
