Digital Equipment Corporation
Recently Published Documents


TOTAL DOCUMENTS: 110 (five years: 4)

H-INDEX: 5 (five years: 1)

2022 ◽  
Author(s):  
THEODORE MODIS

The logistic substitution model is used to study substitutions among different microcomputer models and technologies at the low end of the computer range at Digital Equipment Corporation.
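As general background (a sketch, not the author's fitted model), logistic substitution analysis builds on the logistic S-curve and the Fisher-Pry transform, under which a logistic market share plots as a straight line over time; all parameters below are invented for illustration:

```python
import math

def logistic(t, saturation, midpoint, rate):
    """Three-parameter logistic S-curve: level at time t."""
    return saturation / (1.0 + math.exp(-rate * (t - midpoint)))

def fisher_pry(share):
    """Fisher-Pry transform: a logistic share f becomes linear in t
    when plotted as log(f / (1 - f))."""
    return math.log(share / (1.0 - share))

# Hypothetical newcomer technology taking over a market:
# its share follows a logistic, so its Fisher-Pry line is straight.
for year in range(10):
    share = logistic(year, 1.0, 5.0, 0.9)
    print(year, round(share, 3), round(fisher_pry(share), 3))
```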


2022 ◽  
Author(s):  
THEODORE MODIS

The logistic diffusion equation is modified to account for the long tail seen in revenues from servicing computer products. The model is applied to computers of DEC (Digital Equipment Corporation).
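The abstract does not spell out the long-tail modification itself, so the sketch below shows only the unmodified baseline it starts from: the logistic diffusion equation dN/dt = r N (1 - N/K), integrated with a simple Euler step (all parameters hypothetical):

```python
def logistic_growth(n0, r, cap, dt, steps):
    """Euler integration of the plain logistic diffusion equation
    dN/dt = r * N * (1 - N/cap); no long-tail correction applied."""
    n = n0
    trajectory = [n]
    for _ in range(steps):
        n += r * n * (1.0 - n / cap) * dt
        trajectory.append(n)
    return trajectory

# Hypothetical run: growth from 1 unit toward a ceiling of 100.
path = logistic_growth(1.0, 0.5, 100.0, 0.1, 400)
```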


2022 ◽  
Author(s):  
THEODORE MODIS

The work presented here puts forward a fractal aspect of natural growth. The S-shaped pattern of a logistic function is analyzed in terms of several constituent logistic functions. The approach sheds light on two familiar phenomena: the undulatory evolution of growth, which gives rise to an alternation between high- and low-growth periods, and the increasingly noticeable shrinking of product life cycles. There are some economic and political implications for the European nations. A quantitative example is given for computer sales at Digital Equipment Corporation. The approach is further generalized to suggest that any growth process can be analyzed in terms of natural-growth subprocesses. Applied to human growth, this analysis yields precise definitions for the timing of transitions such as babyhood to childhood and childhood to adolescence.
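The decomposition idea can be illustrated (with invented parameters, not the paper's fit to DEC sales) by summing two constituent logistics; the total S-curve then exhibits the alternating high- and low-growth phases described above:

```python
import math

def logistic(t, cap, midpoint, rate):
    """Single logistic growth pulse saturating at cap."""
    return cap / (1.0 + math.exp(-rate * (t - midpoint)))

def composite(t, phases):
    """Overall growth as a sum of constituent logistic sub-processes,
    one (cap, midpoint, rate) triple per phase."""
    return sum(logistic(t, cap, m, r) for cap, m, r in phases)

# Two hypothetical sub-processes saturating at 40 and 60 units:
# the sum saturates at 100, with a growth slowdown between phases.
phases = [(40.0, 3.0, 1.2), (60.0, 9.0, 0.8)]
```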


2019 ◽  
Vol 55 (2) ◽  
pp. 141-160 ◽  
Author(s):  
Michael S. Lewis

This article uses an institutional lens to analyze organizational failure. It does so through a historical case study of Digital Equipment Corporation, an innovator and market leader in minicomputers that faltered and eventually failed during the period of technological change brought on by the emergence of the personal computer. The failure of Digital Equipment Corporation is interesting because it occurred despite the company's ability to adapt to changing technological forces. An institutional analysis shows that while Digital Equipment Corporation was able to develop personal computers widely considered technologically superior to those of its competitors, it resisted broader changes occurring in its institutional context. This study suggests that responding to external forces of change, such as technology, may not be enough: an organization must determine if and how such change might shift its institutional context, and then develop strategies to address that shift.


To understand the never-ending battle between the developers of anti-spam detection techniques and the spammers, it helps to know the history of spam mail. On May 3, 1978, Gary Thuerk, a marketing manager at Digital Equipment Corporation, sent his first mass email to more than 400 customers over the Arpanet to promote and sell Digital's new T-Series VAX systems (Streitfeld, 2003). As he put it, "It's too much work to send everyone an e-mail. So we'll send one e-mail to everyone." He added with pride, "I was the pioneer. I saw a new way of doing things." Like every technology, mass email can be used with both good and bad intentions. Thuerk could hardly have dreamt that this method of sending mail would become a research area in its own right; he ended up being crowned the father of spam mail rather than the father of e-marketing. Today the internet receives some 2.5 billion pieces of spam a day from Thuerk's spiritual followers.


Author(s):  
Luca Anselma ◽  
Diego Magro

Configuring means selecting and assembling a set of given components to produce an aggregate (or a set of aggregates) satisfying some requirements. All the component types are predefined, and no new component type may be created during the configuration process. The result can be physical objects (such as cars or elevators), non-physical entities (such as compound services or processes), or heterogeneous wholes made of both physical and non-physical parts (such as computer systems with their hardware and software components).

The configuration process must take into account both endogenous and exogenous constraints: the former pertain to the type of the assembled object(s) (and therefore hold for all individuals of that type) and mainly arise from interactions among components, whereas the latter usually represent requirements that the final aggregate(s) should satisfy. These constraints can be complex enough to make the manual solution of configuration problems a very hard task in many cases.

The complexity of configuration and its relevance in several application domains have stimulated interest in its automation, and Artificial Intelligence has provided various effective techniques toward this goal from early on. One of the first configurators was also one of the first commercially successful expert systems: a production-rule-based system called R1 (McDermott, 1982, 1993). R1 was developed in the early 1980s to configure VAX computer systems and was used for several years by Digital Equipment Corporation. Since then, configuration has gained importance in both industry and marketing, owing to the support it offers to the mass-customization business strategy and to the new commercial opportunities provided by the Web. Configuration remains an important application field for many Artificial Intelligence techniques and still poses many interesting problems for scientific research.
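In its simplest form (a toy sketch, not R1's production-rule approach), configuration can be treated as a constraint-satisfaction search over a fixed catalogue; the component names, prices, and constraints below are all invented:

```python
from itertools import product

# Hypothetical toy catalogue: pick one component per slot so that
# endogenous constraints (component compatibility) and exogenous
# requirements (the customer's budget) are both satisfied.
CATALOGUE = {
    "cpu":    [("cpu_a", 300), ("cpu_b", 500)],
    "memory": [("mem_small", 100), ("mem_large", 250)],
}

def compatible(config):
    """Endogenous constraint: cpu_b requires the large memory module."""
    if config["cpu"] == "cpu_b" and config["memory"] != "mem_large":
        return False
    return True

def configure(budget):
    """Enumerate all component combinations and keep those meeting
    both constraint kinds; fine for toy sizes, not real catalogues."""
    slots = list(CATALOGUE)
    results = []
    for choice in product(*CATALOGUE.values()):
        config = {slot: name for slot, (name, _) in zip(slots, choice)}
        cost = sum(price for _, price in choice)
        if compatible(config) and cost <= budget:
            results.append((config, cost))
    return results
```

Real configurators replace the brute-force enumeration with constraint-propagation or rule-based search, since industrial catalogues make exhaustive enumeration infeasible.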


2011 ◽  
pp. 15-41
Author(s):  
Ashutosh Deshmukh

In the late 1950s and early 1960s, the mega-corporations of the day began to handle data volumes that rivaled government requirements. This data could not be handled manually, let alone cost-effectively. Accounting and financial information, with its repetitive nature and heavy volume, became a prime candidate for automation. Initial accounting programs were written for mainframe computers, not surprisingly, since IBM and its Big Irons ruled the computer world. Early mainframes were large and cumbersome, owing in part to ferrite-core memory. Processing intelligence was centralized in the mainframe, which served a large number of users and processed data in batch mode: users submitted data through dumb terminals, and jobs were scheduled according to queue length and priority. Mainframes provided a high level of security and reliability. Minicomputers, pioneered by Digital Equipment Corporation, had similar capabilities but were smaller and less powerful. Today the distinction between mainframes and minis is very blurred and, for our purposes, makes little practical difference.

