Talk:Workstation
From Wikipedia, the free encyclopedia
The article is a bit confused; it would be better to logically separate random historical bits from the general concept. "Workstation" also implies a Unix OS, or at least something with a compiler, as opposed to a Windows desktop.
Cleanup
This article needs cleanup. The article needs to clarify the following:
- A general, non-OS-specific definition of the term.
- Use of the term among Unix/Linux computers.
- Use of the term in the Windows and Mac OS X worlds, as in "Windows NT Workstation".
- Workstations as computers used for specific tasks, such as imaging, video creation, local or remote control or monitoring of machines, devices, etc., (see definition: [1]).
--Cab88 23:30, 16 February 2006 (UTC)
But... It helps to have general agreement on what a workstation is, or is not
I agree. But keep in mind that this is a relatively complex topic, and that a lot of meaning is bound up in the word "workstation". All at once, it's a hardware specification and/or performance criterion, an operating system choice, and a specific application software deployment, all on one machine.
Also, Microsoft helped pollute things by tacking on "Workstation" to their operating systems, as if to suggest that NT would turn a PC into a workstation, which it most definitely does not. They have also done this with their use of "Engineer" in their MCSE (Microsoft Certified Systems Engineer) diploma. They only stopped when professional engineering regulatory organizations told them to cease and desist. Just about anything that company does mars the technical purity of things.
More to the point, with hardware improvements in PCs to support better multitasking, with multiple processors, and with the rise of Linux on the desktop, many people feel that Linux on powerful PC hardware constitutes a workstation; some would go along with that, and others would not.
So you see, it's hard to do an article like this when there is no clear consensus on what a workstation is in 2006 vs. what it was in 1986, when the first Apollo workstations came out, based on my reading of the history of technical workstations. I have used Sun workstations for years, and love them for doing any serious technical work, but I also know I could do most of it on a Linux PC if I had to, as long as I don't push it too hard and cause something to crash.
When time permits, I will try to reorganize this article. But you know, I have been searching the net for a while, trying to find an authoritative definition of what exactly a workstation is and is not, how it compares to a fast PC, etc. It's ironic that I may end up being one of the authors of such a document. I find it hard to believe there are not others who would have already done a bang-up job of this, better than I could do...
--SanjaySingh 05:11, 17 February 2006 (UTC)
- I would suggest the following, then. Start with the dictionary definition and show how this derived from the 1980s concept of a workstation computer (unless there were "workstation" computers before then that I am not aware of). Then discuss how the workstation hardware of the past was generally more powerful than the average PC, and how modern powerful PCs have blurred the distinction. The way the label "workstation" is used in the modern day should be discussed. The Microsoft use of the term in the "NT Workstation" OS can be discussed separately, as it does not relate to the dictionary and hardware-based definitions but is more about the OS capabilities and intended audience. --Cab88 22:53, 18 February 2006 (UTC)
this definitely needs to be cleaned up. i don't understand the difference between a PC and a workstation
The reason you can't see the difference is that, today, there really isn't one. It's a distinction without a real difference. Frankly, it seems to me that the only people who make the distinction are really more interested in making a prestige-oriented statement about themselves. The equipment/platform is only a symbol. A modern PC is just as powerful and capable, in principle, as any of the "workstations" that Sanjay is attempting to differentiate it from. A PC can run some version of Unix (Linux or FreeBSD), it has a powerful processor, lots of memory, advanced graphics, etc.
The only thing that makes sense for this article is to express it all in the past tense. The distinctions that remain, such as they are, have more to do with software than with hardware, and those distinctions seem mostly of the snobbish sort, not any real technical issues. The definition of words like "engineer" has nothing whatever to do with this question; it's just a slam against Microsoft. (The person who picks up your garbage is probably called a "sanitation engineer"... so what?) Windows can multi-thread just as well as Unix, perhaps even better. The "workstation" has become a commodity. Deal with it. The word has as much currency as the old term "mini-computer", which today is a meaningless anachronism. -- RussHolsclaw 00:50, 15 March 2006 (UTC)
Does anyone in this discussion (other than me) actually USE PCs or workstations for technical work?
Some comments for Russ. Call me an elitist if you like; that's perfectly fine. But I am within my rights to suggest that you don't see a distinction between PCs and workstations because you don't push the machine hard enough to see any differences.
The system-level architecture of a typical PC is designed for low cost, NOT multitasking. I know from experience. It's possible to have computational jobs running in the background on a Sun workstation and the machine still remains usable. On a PC, it bogs down and becomes unwieldy to use. It's not about the CPU; it's about the design of everything else. One PCI chipset for 6 slots, small caches, IDE drives... everything made to run one thing at a time. No crossbar switch architecture, no SCSI, no balanced I/O for multiple tasks moving information in multiple directions at once.
Benchmarks tell you performance under ideal conditions, but PCs under multitasking loads do not perform anywhere near what the benchmarks would have you expect. There are millions of armchair analysts who are duped by benchmarks, and who think that dividing clock rates by the cost of the machine makes them smart, when it proves nothing.
People who buy workstations need a machine to do work that scales up to big and complex models. I am not talking about dinky little AutoCAD models. I am talking about things like fighter planes, or space craft, or DNA research, or chip design, or scientific simulation.
If you think a 32-bit Pentium 4 can even load the human genome into memory for speedy sequence analysis, I invite you to try it. The Pentium 4 is a marketing-driven processor that only achieves its high clock rates by having a ridiculously long 20-30 stage pipeline, which means that it suffers from pipeline stalls when branch predictions are wrong, and its instructions per cycle is lower than its competition's. When you start to push things where the memory access pattern exceeds what the caches will support, things will bog right down, and stay down.
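(As a back-of-envelope illustration of the pipeline argument above, here is a rough model of how misprediction penalties scale with pipeline depth. The depths and rates are illustrative numbers for the sake of the arithmetic, not measured Pentium 4 figures.)

```python
# Simple model of branch-misprediction cost. A mispredicted branch
# roughly costs a pipeline refill, so the penalty scales with depth.
def effective_cpi(base_cpi, branch_fraction, mispredict_rate, pipeline_depth):
    """Average cycles per instruction once misprediction flushes are counted."""
    penalty = pipeline_depth  # cycles lost per flush (rough approximation)
    return base_cpi + branch_fraction * mispredict_rate * penalty

# Hypothetical comparison: a deep 20-stage pipeline vs a shorter 10-stage
# one, with 20% branch instructions and a 10% misprediction rate.
deep = effective_cpi(1.0, 0.20, 0.10, pipeline_depth=20)
short = effective_cpi(1.0, 0.20, 0.10, pipeline_depth=10)
print(deep, short)  # 1.4 vs 1.2: the deeper pipeline pays double the tax
```

The point of the sketch: the deeper the pipeline, the more each wrong branch prediction costs, independent of clock rate.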
Now let's talk about software and related things.
The only reason I mention Microsoft at all is because their use of "workstation" has confused the definition of what workstation hardware is for people who are not familiar with Unix or RISC processors... Microsoft markets inferior operating systems on commodity hardware to the masses, and tells them it's a workstation, when it's not. I have no idea where you got the notion that Windows multithreading is as efficient as Unix's. RISC processors are designed around the idea of multitasking (Sun's register windows), and floating point (DEC Alpha), and multiple processors in one machine. Windows is most certainly not designed from its foundations for these things, because the average Windows user does not do jobs that require hours of number crunching while still acting as a server on a network, and also a graphics console. You CANNOT use Windows for multitasking efficiently, or safely. It will either slow to molasses, or crash. You do ONE thing at a time on Windows, or else...
Microsoft tries to tell people they are "systems engineers" when they are not engineers at all. I work in an engineering department, where some academic standards apply. Microsoft software is suitable for entertainment. I like playing Team Fortress. But I would not use Windows for anything important like my neural network simulations. And I stand by my contention that Microsoft has polluted the term "workstation" by their frivolous use of that term in their marketing efforts, in the same way they pervert the word "engineer."
Linux is very cool, but it's a 32-bit hobbyist's operating system for the most part. Since it is community-driven in terms of its development, it largely follows the demographics of the majority of its adherents, which means 32-bit x86 single-processor hardware remains the most common configuration, and therefore most software will be designed around this assumption. The volatile nature of the kernel means that software often requires recompilation or breaks. Linux is not yet ready to scale to 64-bit: the commodity hardware infrastructure is only now going to 64-bit (UltraSPARC from Sun has been 64-bit since 1995), and there are few commercial engineering applications that run on Linux, due to its open-source nature and business viability issues.
Apple's Mac OS X is BSD Unix under the hood, with a very polished, usable interface on top. It's the best consumer-level operating system out there. Running on 64-bit hardware, multitasking, with multiple processors, it lacks only scalability to large-scale server hardware, and technical applications to run on it.
Solaris 10 is the world's most advanced and powerful 64-bit operating system, though it might not be the most endearing. It is the most scalable, to hundreds of CPUs and hundreds of gigabytes of RAM, and the highest performing, beating even Linux on many standard tests. AND it has hundreds of high-end applications, ranging from Pro/ENGINEER and CATIA for 3D mechanical design, to Cadence, Synopsys, and Mentor Graphics for electronic design automation. Also, lots of open-source packages can be compiled easily on this hardware and software platform to run on multiple processors at once, seamlessly.
Anyway, despite the less than diplomatic words that have been exchanged thus far, I am glad for the discussion. It means people DO care about this article, and it probably means that when all is said and done, this article will be an authoritative one for people to consult. --SanjaySingh 09:21, 5 April 2006 (UTC)
You make some very good points here, Sanjay, along with interesting information concerning bus architectures, etc. I stand corrected. However, I take exception to the invective you posted on my personal "talk" page. I don't think it would generally be regarded as "diplomatic".
Notwithstanding all of that, I was mostly objecting, albeit clumsily, to the irrelevant invective unleashed at Microsoft for "polluting" the word "workstation" here on the discussion page. The fact is that the word has long been used in a much broader sense than the one you attach to it.
In particular, I recall that during the late 1980s IBM (where I worked for 26 years, starting in 1966) made frequent use of the word in a much broader sense. The high-performance type of machine you mention was described as an Engineering/Scientific Workstation, not just a workstation. At the same time, IBM documents made frequent use of the term in its more literal sense, i.e. a machine one stationed oneself at to do work... work of any sort, that is, including spreadsheets and word processing.
In fact, IBM even used the term fixed-function workstation in reference to what is colloquially called a dumb terminal. (No device carrying an IBM trademark could be called "dumb", of course! :-) )
So, if I were to take issue with anything in the Workstation article itself, I suppose it would be the title, and the first sentence, which lays exclusive claim to the general term "workstation" while characterizing other, more specific, appellations as being the "colloquial" ones.
--RussHolsclaw 13:41, 12 April 2006 (UTC)
OK, let's work together on this, and move it forward
I was reading the supercomputer page recently, and they have an interesting and perhaps usable template for discussing their technology; I think their structure could be of use in this article. Supercomputers, like workstations, are also a moving target, in that the state of the art of 10 years ago is today's run-of-the-mill machine, or even (a pity) a paperweight. I am a big fan of Seymour Cray. He rocked.
I have written up an article for my department, intended to be like a white paper, that discusses in some detail many of the issues we have been going over in this page.
I am glad you mention the "dumb terminal" as a workstation, because technically it is one end of a continuum of computing power. At the other end would likely sit a deskside machine, whether it's an old Apollo workstation or a Sun Ultra 450 quad-processor machine, which just happens to be the entry level of their workgroup servers as well. It would offer additional context in which to consider where desktop machines were, where they are now, and where they are going.
RISC microprocessors were considered a disruptive technology that enabled a significant advance in desktop machines at the time they were introduced into systems. A good article on workstations would need to track the evolution of networked desktop computing from the earliest days of terminals, to the present day distributed networks of workstations.
I know your experience would help with this, because you have seen the evolution of high-end hardware... When I read your bio, I was a little shocked to see 26 years of IBM experience. I was 11 or so at the time you began your IT career. I started trying to learn BASIC on the Radio Shack TRS-80, and saved my paper-route money when I was 12 or 13 to buy my own computer. Eventually I ended up at U. Waterloo, and picked up some computer architecture along the way. I believe that if we pool our knowledge, we can have a bang-up article that will be authoritative, easy to read, and will be referred to by many people who are seeking to understand this class of machine.
Regards,
--SanjaySingh 03:20, 13 April 2006 (UTC)
Actually, I don't think you read my bio right. I worked for IBM for 26 years, ending 13 years ago. I've been in the business 39 years. I just turned 60 in January. There were no TRS-80's when I started with IBM, just System/360 computers, which I provided support for in those early days. In '76, I built one of the early S-100 bus machines, an IMSAI 8080. Eventually, I got it running CP/M, as soon as I could afford a floppy disk drive for it. They were a bit expensive in those days.
My career with IBM mostly had to do with mainframes, though. I had no contact with Unix until more recent times. I've done a little Unix work in the past few years, but mostly modifying programs written by others. In my mainframe work, I worked on systems that provided remote support to customers. This included devising a protocol for downloading software fixes to System/370 systems, and providing the ability for remote viewing of memory dumps and other software diagnostic data. In fact, this application involved a very early version of base-64 encoding, similar to the type used today to handle email file attachments. That was in about 1973.
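(For readers unfamiliar with the technique Russ describes: base-64 encoding maps each 3-byte group of arbitrary binary data to 4 characters from a 64-symbol printable alphabet, so the data survives text-only channels such as email. A minimal sketch using Python's standard library; the payload is just a made-up example.)

```python
import base64

# Each 3-byte group becomes 4 printable characters; the final group is
# padded with '=' so the output length is always a multiple of 4.
payload = b"S/370 dump"           # 10 bytes of arbitrary "binary" data
encoded = base64.b64encode(payload)
print(encoded)                     # b'Uy8zNzAgZHVtcA=='
assert len(encoded) == 16          # ceil(10 / 3) * 4 characters
assert base64.b64decode(encoded) == payload  # round-trips losslessly
```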
One thing I can do is take your "early history" part back a bit further. IBM had an early system of the type that could be called a "scientific workstation", introduced in 1959. That was 7 years before I started working for IBM, but the machine was still being sold. It was the IBM 1620. There was another system in 1965, just the year before I joined IBM, called the IBM 1130. The latter machine was built using electronics technology borrowed from the System/360: hybrid integrated-circuit devices that IBM called Solid Logic Technology, or SLT. --RussHolsclaw 05:14, 15 April 2006 (UTC)
Mac OS X bit at the end needs updates
The part near the end mentioning that Mac OS X machines meet the criteria of workstations, "yet it remains to be seen what will happen with the Intel switch", needs updating. It's May 2006 as I write this, and Core Solo and Core Duo Intel processors have been available in iMacs, MacBook Pros, and Mac minis since February and March (for the latter).
However I don't know the details of the processor or how it counts in this regard. So someone else needs to do it.
Requested move
- The following discussion is an archived debate of the proposal. Please do not modify it. Subsequent comments should be made in a new section on the talk page. No further edits should be made to this section.
The result of the debate was move. Do not revert this move. --Philip Baird Shearer 23:20, 22 June 2006 (UTC)
Workstation (computer hardware) → Workstation … Rationale: 'Workstation' is a redirect to the longer name article, might as well move it to Workstation. -- Frap 16:44, 12 June 2006 (UTC)
Survey
- Add *Support or *Oppose followed by an optional one-sentence explanation, then sign your opinion with ~~~~
- Oppose. The term 'workstation' predates its current use in the computer industry; I believe it first came into use in the early days of assembly line manufacturing but I don't have a verifying citation to back that up. User:Ceyockey (talk to me) 00:44, 13 June 2006 (UTC)
- Support Following the KISS principle Jay32183 18:35, 21 June 2006 (UTC)
Discussion
- Add any additional comments
- Tag the redirect Workstation with Template:R with possibilities. User:Ceyockey (talk to me) 00:44, 13 June 2006 (UTC)
If I had seen this debate earlier I would have voted against it. The term workstation is as User:Ceyockey suggests. But as there is no article on that meaning, there is no need to disambiguate this page. When an article on the term workstation is written, this page can be moved again. --Philip Baird Shearer 23:20, 22 June 2006 (UTC)
misc observations and suggestions
observation:
i've been musing about some of the same questions being debated here and looked up this article to see what wikipedia folk had to say about them, so i'm not terribly surprised to find the debate.
suggestions:
on definitions:
-- point out that the term "workstation" has had different meanings in different contexts and has evolved over the years.
-- two general concepts of workstation:
-- -- a computer designed to be used interactively by one person at a time that is built using techniques typically used for multi-user computers (servers or timesharing minicomputers, depending on the historical period) and puts all the associated performance and reliability resources at one user's disposal.
[ note this phrase appears mid-article: "Personal computers, in contrast to workstations, were not designed to bring minicomputer performance to an engineer's desktop, but rather..." ]
-- -- a computer and peripherals dedicated to a specialized use (faxing, laboratory instrumentation monitoring, etc.).
an entirely different approach to the definition problem is to base the definition not so much on the hardware and software architecture, but on the business problem and the user.
AFAIK workstations have typically been assigned to users working on large time-sensitive problems that are important to the organization and/or to users whose time is expensive. in other words, firms spend extra money to put big, fast, reliable boxes in the hands of some users and not others based on business criteria. now, how you go about expressing that concisely is another question. :-)
Ericfluger 18:57, 11 March 2007 (UTC)
incentives to use workstations
the article gives some attention to the business incentives to deploy workstations, but i think a bit more is merited. after reading what's there i still have some basic questions that i think are still reasonably in-scope for an encyclopedia article.
[ i recognize that in a business setting almost any criterion for problem solving is ultimately reducible to cost/revenue considerations, but having said that... ]
i'm wondering if initial popularity of workstations was due to cost-effectiveness of offloading from the mainframe, or whether they simply provided a practical way of doing things that otherwise might have been quite difficult at the time.
IIRC at the time the workstation concept was starting to take hold, connecting multiple high-resolution, high-speed graphics terminals to one mainframe and locating such terminals far from the machine room would have been a serious challenge given the data communications technology of the day. using smaller computers with built-in frame buffers side-stepped this problem.
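(a quick back-of-envelope sketch, with illustrative numbers rather than figures for any specific product, of why pushing a megapixel bitmap over a terminal line of that era was impractical, and why a local frame buffer side-stepped the problem:)

```python
# Rough arithmetic: one full-screen update of a monochrome megapixel
# display, pushed over a typical serial terminal line of the day.
pixels = 1024 * 1024          # ~1 megapixel display
bits_per_pixel = 1            # monochrome bitmap, as on early workstations
frame_bits = pixels * bits_per_pixel

line_speed = 9600             # bits/s, a common terminal line rate
seconds_per_frame = frame_bits / line_speed
print(round(seconds_per_frame, 1))  # ~109.2 seconds for ONE full redraw
```

with a frame buffer sitting right next to the CPU, the same update is limited only by local memory bandwidth, which is why the small-computer-with-framebuffer design won.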
i seem to recall reading (long ago) an interview with an early adopter of networked workstations for software development who said that the driving factor was predictable compile times (which came at the cost of poorer hardware utilization) that made planning and managing projects much more systematic.
i also wonder if there was a "fashion factor". it became the way folks did things for a while. and users who had their own box at one job (or in college lab) and went on to another job wanted the same thing there too.
so i suspect there may have been more considerations than are currently mentioned. Ericfluger 19:56, 11 March 2007 (UTC)
thin clients and workstations
while it can be hard to say what's a workstation and what isn't, i think it's safe to say there's a distinction to be made between a workstation and a thin client FROM A FUNCTIONAL STANDPOINT.
as far as i can tell, workstations generally provide "local applications processing". thin clients don't.
viewed from a hardware-only perspective things get a bit dicier. in some cases the same machine can be configured as either a workstation or a thin client. (sun's only x-terminal was based on entry level workstation hardware. PCs can be configured to work as thin clients in various ways.)
the sun ray ultra-thin clients mentioned in the article are clearly not workstations: no workstation hardware, software, or functionality. while sun's new-ish scalable back-end graphics technology may now make replacing workstations with thin clients feasible and even attractive, that still doesn't make the thin client "workstation-like".
i suspect it may be best to mention this alternative and link to a separate article rather than explore it in depth.
Ericfluger 19:55, 11 March 2007 (UTC)
Processor
Does the phrase "server-level processor" have any objective, commonly accepted meaning in the industry, or is it just vacuous PC-magazine-speak? As I recall, every microprocessor chip introduced since the 8085 was said to be "far too powerful for personal computer use and only suitable for servers". Servers... file servers... don't even need to do floating-point math. I'm taking it out. --Wtshymanski (talk) 04:44, 27 January 2008 (UTC)
- Server-level processor usually refers to Xeon and Opteron processors. It also refers to pure server processors such as the Sun UltraSPARC and IBM POWER. As for all servers not needing floating-point mathematics: server simply refers to a computer that serves a client. A file server is just one type of server, not the definitive one. As for the 8085 "being far too powerful for personal computing and suitable only for servers", please provide a reliable source for that. Back in those days, servers were built on VAX and other proprietary processors. Mainframes, too, were big then. If I am not mistaken, the 8085 was for a terminal that connected to a mainframe. Rilak (talk) 08:41, 27 January 2008 (UTC)
Workstation is an outdated term
My background with workstations goes back to 1984 and Sun. Back then there was a clear distinction between PCs and workstations. These days I don't believe the distinction exists anymore. Back in the 80's, I think the four main requirements of a workstation were that it meet the 3M computer requirements (a MIPS, a megapixel, and a megabyte), that it run a multitasking operating system, that it have at least a 17" bitmapped display, and that it be connected to a high-speed network. As it happens, at that time most of the machines that met this spec ran a flavor of UNIX, used a window system (X became the standard), and used TCP/IP and Ethernet for their network connection. The 3M computer dividing line held up until the late 80's and early 90's: while by then the hardware specs of a workstation had increased by more than an order of magnitude, the typical mid-range PC was just starting to meet the 3M spec.
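(The "3M" criterion is simple enough to state as code. A hypothetical checker; the two example machines use illustrative spec figures, not exact ones for any particular model:)

```python
# Checks the "3M" workstation criterion described above:
# at least a MIPS of CPU, a megabyte of RAM, and a megapixel display.
def meets_3m(mips, ram_bytes, display_pixels):
    return (mips >= 1.0
            and ram_bytes >= 1_000_000
            and display_pixels >= 1_000_000)

# An early-80s Sun-class machine vs a contemporaneous home/office PC
# (rough, illustrative numbers).
sun_like = meets_3m(mips=1.0, ram_bytes=4_000_000, display_pixels=1152 * 900)
pc_like  = meets_3m(mips=0.3, ram_bytes=640_000, display_pixels=640 * 200)
print(sun_like, pc_like)  # True False
```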
But in the 90's, as hardware performance increased, Ethernet and TCP/IP became the standard for all computers, and megapixel large-screen displays became affordable, the 3M computer differences were no longer that great. A typical high-end PC had similar hardware specifications to a typical workstation, and the price difference was typically a factor of 4 to 10. This is why most hardware vendors either got out of, or dramatically scaled back in, the workstation market. Yes, you could still buy them, but they were very high-end machines used for very specific tasks, different types of CAD mainly. Graphics-intensive applications could usually be done on a high-end PC.
The last hurdle to overcome was the multitasking operating system. This happened in the late 90's. Linux and UNIX running on Intel hardware became solid and achieved a production level of quality. Also, with Microsoft's introduction of Windows NT, the Microsoft world finally had a true multitasking operating system. So the high-end software vendors started porting their software to the Intel architecture, either UNIX/Linux or Windows NT. This was the death knell for the workstation as a distinct class of machine. For me personally, the switch happened in the 97-98 time frame, when I swapped my Sun workstation for a Linux PC. At that time I changed my thinking about workstations, to thinking of them as plain desktop computers. I think this change had been happening for a while. So to me at least, the term "workstation" seems like an archaic term, like the term "mini-computer". Yes, there are still a few mini-computers made; the IBM AS/400 (System i now) comes to mind. But if someone described a need for a new computer as needing a mini-computer, I would think that they had a need to run legacy applications. If you were thinking of running current multi-user applications, you would host them on a server, not a mini-computer. Likewise, there are still people buying workstations, but I suspect that it is mainly to run legacy applications on legacy CPU architectures.
Workstations were great while they lasted, but I am glad that a typical desktop PC now offers a user experience very similar to what a workstation used to.
Robert.harker (talk) 00:45, 25 April 2008 (UTC)