swordfish102

asked on

Software development question

How has the reduction in the cost of storage space impacted software development?

Essay question (two pages if possible)
COBOLdinosaur

Well, this looks like a schoolwork assignment, so:

<snip>
Here is nietod's standard response to academic assignment questions:

We cannot do your schoolwork for you. That is unethical and is grounds for removal from this site
(for both you and the experts involved). We can provide only limited help with academic assignments.
We can answer specific (direct) questions, like you might ask your teacher. We can review your work
and post suggestions, again, as your teacher might do.

Do you have specific questions?
Do you have any work on this (incomplete even) that we can review?
</snip>

If I'm wrong about it being schoolwork, then perhaps you could enlighten me as to the purpose of the question.

Cd&
 
swordfish102

ASKER

This is not for school work! I am writing an article for Software Computing Magazine. Anyone who responds will be credited for there contributions.
ASKER CERTIFIED SOLUTION
schwertner

I am looking for a little more focus on the software development aspect of the reduction in storage space.
Cheap storage led to the mass adoption of cheap computers. Because every PC needs at least an OS and some application programs, the profits of companies like Microsoft and Apple grew large. This was a good reason to increase software development and to improve the OS and the applications. The companies could invest money in expanding software development. A new world - the world of applications - was created (Siebel, Broadvision, Vignette, Financials, CRM, CTI, ...). This is a huge segment of the software industry.

The low cost of PCs and other computers made it possible to provide every desk in every office with a PC. This led to a big push in producing network software (Internet, LAN, WAN). New languages and formats like Java, JavaScript, HTML, and XML were invented.

The mass usage of computers also led to increased usage of databases. In fact, every business site on the Internet has a database behind it - small or big. So the producers of database software, like Oracle, MS, etc., increased their investments in producing software. Moreover, the Internet as a new medium increased the scope of standard database software. Oracle introduced big software products like Oracle Application Server, Oracle Portal, etc.

As a result a new branch, called the SOFTWARE INDUSTRY, was established. By contrast, 15 years ago the software-producing offices were in fact only the software departments of the big hardware producers like IBM, DEC, ...

In fact every group and every topic on this site is devoted to a different branch of software technology. And those technologies constitute the software industry.

We also must not underestimate the fact that 10 million people are involved in IT - a large proportion of them software developers.
pssst! swordfish102 - a writer should know that " .. credited for there contributions" should read "credited for their contributions" : )


For i = 1 to two_pages

The reduction in the cost of storage space has impacted software development by allowing people to write great
big sloppy operating systems and great big sloppy
applications and also include a whole bunch of stuff that is of no great value to anyone except it adds a line to the promotional literature and it .......

Next i

leo

Come on . . . . what's up with everyone here? My dog is stuck inside my piano, and I could use some help here on this question.

Frank Rizzo
The real problem is that the change in technology is faster than the habits of people, and really the answer to your question is: "It hasn't" (That might make a controversial piece!)

Story: Customer system very slow. They start hacking the Rat about: you make too many database accesses. The Rat says that's nonsense, but the argument goes on. Answer: the Unix I/O buffer was set at 800K (yes - eight hundred kilobytes!) in a 128-megabyte server. This was the default setting from the manufacturer. Give Unix 64 megabytes, and the Rat is left in peace.

Story: The Java language creates all data in memory on the heap. A garbage collector tidies things up and reclaims memory later. When Java first appeared everybody said "Wait! Java takes too much memory." Nowadays nobody is interested.

But the cheapness of memory has not quite impacted software development. Effectively the suppliers just stuff more and more functionality in the package (Do we really need all these options in Word 2000?).

But I have at least started. In my RatScript language there are no "FileWriters" and "FileReaders". You can only read the ENTIRE file into a string (which can be arbitrarily long) and only write a string into a file. This consumes memory like mice with cheese, but saves messing around with loops and buffers.
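Here is a minimal sketch of the difference, in Java rather than RatScript (whose API isn't public); the file name is made up:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WholeFileRead {
    public static void main(String[] args) throws IOException {
        Path p = Path.of("data.txt"); // hypothetical input file

        // RatScript style: trade memory for simplicity - slurp the ENTIRE file at once
        String all = Files.readString(p);
        System.out.println("chars: " + all.length());

        // conventional style: a buffered loop that never holds the whole file in memory
        try (BufferedReader r = Files.newBufferedReader(p)) {
            long chars = 0;
            String line;
            while ((line = r.readLine()) != null) {
                chars += line.length();
            }
            System.out.println("chars, newlines excluded: " + chars);
        }
    }
}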

hope that helps.
Friends,

30 years ago (spring of 1971) I wrote my graduate thesis at the University. I worked on an old machine, writing programs in octal machine code. The memory of that machine had only 32,000 "words". Every word had 37 bits. The machine was installed in a very big hall. I was happy!

After that I programmed in Assembler, Fortran, PL/1, Macro-11, RMS, Btrieve, dBase IV, Clipper, Pascal, Ada ...

Now I am an Associate Professor in Computer Science, PhD, author of three books in CS, an Oracle developer ..... I know that you are young and I wish you success and happiness!

Because I will be in a better world 30 years from now, please tell me how the thing now called a "computer" will look in that future time. I am sure you will see it!

Joseph Schwertner
Plain and simple - more memory led to shoddier software, since people can use the excuse "memory is cheap these days" to cover up their slow, resource-consuming algorithms.
I have just read "The anatomy of an XSLT processor" by Michael Kay (SAXON), who had improved the performance of his processor by a factor of twenty by eliminating the use of Java classes for objects and sticking things back into integer variables. No, more memory has not impacted software development. It's still like the old days.
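A toy sketch of the kind of rewrite Kay describes (not SAXON's actual code) - the same node data held as objects versus as parallel int arrays:

// one heap object (plus header and pointer) per node
class NodeObj {
    int kind, parent, nameCode;
    NodeObj(int kind, int parent, int nameCode) {
        this.kind = kind; this.parent = parent; this.nameCode = nameCode;
    }
}

// parallel int arrays: three ints per node, no per-node object overhead
class NodeArrays {
    int[] kind, parent, nameCode;
    NodeArrays(int n) {
        kind = new int[n]; parent = new int[n]; nameCode = new int[n];
    }
}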

PS: I graduated in '70.
The Windows GUI is based on a large amount of memory. Every application uses its own (and sometimes very big) part of memory. The alternative is to swap RAM out to disk. So without big amounts of memory the modern Windows application would not be the same. Many guys do not love Windows, but we have to recall that nobody and nothing is perfect. Everything is a compromise.
The same goes for multimedia. Look at the reflection of big memory in databases. Oracle was forced to introduce LOBs (Large Objects - CLOB, BLOB, BFILE) as standard data types in version 8 and later, along with a special package, DBMS_LOB, to manipulate these enormously large columns.
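A minimal JDBC sketch of reading such a column as a stream instead of materializing it (the connection URL, table, and column names are invented for illustration):

import java.io.Reader;
import java.sql.*;

public class ClobRead {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//host:1521/orcl", "user", "pass");
             PreparedStatement ps = con.prepareStatement(
                 "SELECT body FROM documents WHERE id = ?")) {
            ps.setInt(1, 1);
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    Clob clob = rs.getClob(1);
                    // read in small chunks so a huge LOB never sits in RAM all at once
                    try (Reader r = clob.getCharacterStream()) {
                        char[] buf = new char[8192];
                        long total = 0;
                        int n;
                        while ((n = r.read(buf)) != -1) total += n;
                        System.out.println("characters: " + total);
                    }
                }
            }
        }
    }
}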
Timesharing and real-time OSes also do not work well enough in small memories.

Many good wishes to BigRat! We are not dead - it is good, it is nice to hear this! BigRat - I wish you all the best, and also the other guys!
The question's point, however, was "how does the reduction in storage cost impact software development" - plainly asking where practices have changed, not whether more functionality or more data could be stuffed into a smaller box.

I maintain that practices have not changed. Over the last thirty years there have been many paradigms invented or proposed to impact the cost of software development, none of which really had anything to do with lots of memory or its cheapness. In fact we are STILL developing programs like building ships from matchsticks. Even people who write programs in O-O languages, which notably consume memory, get round to rewriting them conventionally.
When storage was expensive, people used to do anything to save memory/disk, including leaving the century out of the dates they stored. This caused the Y2K problem. No one needs to do this anymore.
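A tiny sketch of the "windowing" fix that followed - expanding a stored two-digit year around a pivot (the pivot 70 here is an arbitrary illustrative choice):

class YearWindow {
    // a stored "71" should mean 1971, but after 2000 a stored "05" must mean 2005
    static int expandYear(int twoDigitYear) {
        return (twoDigitYear >= 70) ? 1900 + twoDigitYear : 2000 + twoDigitYear;
    }
    public static void main(String[] args) {
        System.out.println(expandYear(71)); // 1971
        System.out.println(expandYear(5));  // 2005
    }
}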
hmm, I'd like to take some time to read the above, but cannot at the moment. At the risk of being repetitive, some quips:

1) This has nearly killed the mainframe & (former) mini industry, which used to be the only place to get a high number of users and common data storage.

2) This has led to recklessness and severe cases of bloat. See also: Easter Eggs. (<more>)

3) This has led to acquiring (via upgrades) so much junk that you can no longer easily find the stuff you want that used to work.

4) Easier now for pirates to swipe stuff and store nearly anywhere, at home or someone's 'free' data store

5) Has not helped Excel use, which seems real stuck at 32,000 regardless of size of HD or amount of RAM. (too much overhead from the embedded flight simulator?)

main thing on development:

6) Easier now to write code elsewhere, like at home. Much earlier, you could go home and edit and compile, but not test on data. The HD size gives us room to test the data, but one cannot neglect the Internet there, for helping to move the data between the drives.

7) Easier to share the home system with family. Tight space meant tight control over room for things like homework. Now the kids can not only load up homework but games as well. So, we lost the PC, like the mainframe & mini. OK, PCs are cheaper, credit is OK, what we did was buy ourselves a new PC for work and... like the kidz know best (shhh) a game or three.

8) Lots more photos & pictures included, especially in web development (although not my personal experience <sob>). I hear of movies and soundtracks stored, but that is also not my turf (yet), suspecting a bandwidth issue (and no time (yet) to find out).

9) Integrity. The data does not disappear any more. While there is also RAID, the HD economy has made it easier to maintain extra storage, extra HDs, extra copies, contingencies, fall-back options, and to create more test cases more quickly for debugging code.

10) This has led to new languages like Java, because now one can count on every user having a tremendous amount of storage to provide to every link that is ever clicked on...   ;^)

[press inverse on calc_pad on that'un]

No comment has been added lately, so it's time to clean up this TA.
I will leave a recommendation in the Cleanup topic area that this question is:
 - Answered by: schwertner
Please leave any comments here within the
next seven days.

PLEASE DO NOT ACCEPT THIS COMMENT AS AN ANSWER !

Nic;o)
Perhaps a split with BigRat...
Hi jaynee,

Had a hard time judging this one.
A lot of contributions and besides BigRat there are also others with a valuable contribution...

I hoped by proposing schwertner that the others would also react, making it easier for the moderator to decide.
My choice for schwertner is mainly based on the vast amount of information in the "head" of the comment line.
Doing as many cleanup Q's as I've done here in programming, I can however make mistakes...
So thanks for your comment (just the second one I've gotten after cleaning 200 very old Q's in this TA..) I'm sure the moderator will take it into consideration!

Thanks and C U !

Nic;o)

Seems only fair to me to comment when you guys are fighting the good fight clearing these up... Maybe I'm the only person who leaves notifications switched on : )
No, the Rat is still listening. Modesty prevented me from proposing myself, so I'm very flattered by your point split proposal.

Mind you I'm nothing in this TA, and the 200 cheeses would help schwertner to move up, so I'm happy with that.

I still maintain that the cost reduction in storage space has not affected SD as much as it should have. We tend to mess around with things that don't matter as if they were very important. Like designing buffer systems, or serial readers and writers, to save on memory.
Hmm, a really interesting "revival" of this Q ;-)

My only insight on space vs SD is that technical design is basically a choice between storage space and time.
A normalized database will save on storage, but increase time for the queries. That is just the reason why data warehouses use de-normalized data to get good response times.
Having enough cheap space doesn't really improve SD speed; it just enables some software (like data warehousing) that wouldn't be affordable otherwise.
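A toy sketch of that space-for-time trade-off (record and field names invented for illustration):

import java.util.Map;

public class NormVsDenorm {
    record Customer(int id, String name) {}
    record OrderNorm(int orderId, int customerId) {}        // normalized: store only a key
    record OrderDenorm(int orderId, String customerName) {} // denormalized: copy the name

    public static void main(String[] args) {
        Map<Integer, Customer> customers = Map.of(7, new Customer(7, "ACME"));

        // normalized saves space but pays a lookup (a join, in a real database)
        OrderNorm n = new OrderNorm(1, 7);
        System.out.println(customers.get(n.customerId()).name());

        // denormalized duplicates data but answers immediately
        OrderDenorm d = new OrderDenorm(1, "ACME");
        System.out.println(d.customerName());
    }
}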

I must also agree with your point, BigRat, as a lot of developers still live in the old "little space" age even though it's no longer necessary. But on the other hand, a lot of developers lack the basics to see the effects of their programs on the network load....

I'll advise the moderator to do a split !
(I know they're not fond of it, but who cares <LOL>)

Nic;o)
"Having enough cheap space doesn't really improve SD speed"

Oh yes it can. In the old days, never mind about data warehousing, you always read files sequentially. We now put large amounts of data into databases, so we don't do sort/merge/update on serial files anymore. Yet all the teaching in CS still goes on about reading files sequentially. In my scripting language I read a file into a string in one operation. There is NO sequential reading capability. This makes the scripting very easy for the mice. We start getting performance problems around 100MB, but then the data really belongs in the database.

My point is that the paradigms are not changing as fast as the technology.
It seems you have not programmed in the good old days when computers had, e.g., 32,000 cells of 37 bits each (I did this in 1971-74, also using punched tape as the input device). We spent most of our time dividing our programs into pieces and thinking about how to place the data in memory to achieve acceptable performance in our applications. Sometimes the same happens nowadays with Oracle Forms and Reports, e.g. when somebody tries to use a LOV with 150,000 entries and wonders what happened to the productivity. Nothing else than the well-known old friend called "swapping" - paging memory out and restoring the needed pieces of data. The list is long enough ... I will stop here.
Ah ... about the points. Let us forget them! Everybody knows they are phantom. I have 126,000 and third place in the Oracle thread, but this does not help me. I lost my job in the industry despite being three times an Oracle Certified Professional - the only thing I do now is teach my students twice weekly. Maybe refund them, or split in 4 or more parts .... The decision is yours, gentlemen!
"It seems you have not programmed in the old good days when computers have e.g. 32,000 cells each 37 bits (I did this in 1971-74 on using also punched tapes as input device)."

What!

The Rat built its own serial machine in the mid-sixties and its first program ran in September 1967 (also with punched tape).

I had a magnificent collection of Cobol statements on punched cards, which I used to mix and match to write programs.

32,000 cells of 37 bits was sheer luxury! I used to have to write programs which fitted into 256 bytes (128 16-bit words).
Hmm, makes me feel young again ;-)
I started with Fortran programs punched on cards on a PDP-11, so I'm no match for the two of you <LOL>

Nic;o)
One of the early machines I played around with (back in '67) was an ICL 1500. The lineprinter was driven by code which you wrote to set up the hammers at each column position for a particular character, which you then fired. After printing the row you had to issue a "RaiseSprocket" command so that the paper would slip past onto the next line. You timed in a loop and lowered the sprockets again when you'd let enough paper through.

The programmer's manual warned you about certain sprocket raising and lowering combinations which would make the solenoids overheat and the paper catch fire.

You wrote your program in sections of around 512 instructions, at the end of which you did a drum transfer to get the next bit of code in. All of this was done by self-modifying code.

In those days with datum and limit machines one had endless fun clearing the drum to zero and starting a drum transfer to "inverse datum". The OS (hah! that's what they called it) added the datum to your starting address and checked that it was not negative nor greater than limit. The drum transfer wiped the memory clean and the poor operator stood there without the slightest clue what went wrong!

I worked my way up, so to speak, to virtual storage machines with database technology for filestore, and then out came the IBM PC. The twits that made that had not learnt a thing from the fifties and made all the same mistakes again, and worse!

Hopefully when Bio-Computing comes out I'll be dead.
Good BigRat!
Nice times! I began without an OS on the machine in 1971 - a contemporary civilian Russian computer called the "Minsk-22". No OS, no translators; input - punched tape; output - narrow paper. I could change every bit of memory directly by pushing buttons on a keyboard.
After that: IBM 360 (DOS), IBM 370 (OS), PDP-11 (VMS), IBM PC (DOS), Clipper, Oracle, 9 exams, 3 times Oracle Certified Professional, PhD, 3 books in Computer Science, Professor in CS, 13 months in the U.S., Experts Exchange "Expert", bla-bla ... nice to speak with you, guys ... we were young in those good times.

In fact I am only using the opportunity to say hello to you and to wish you all the best and success, my friends!!!! :-)
Goodness - if you lost your job, I guess I can't feel all that bad about being made redundant, schwertner!  Hope it comes good for you again.
> Essay

Upon revisiting the link, I recommend a point split to BigRat & schwertner, as I not only enjoyed their dialog, but it fits the theme requested, and I concur with their claims, and,,, it makes me feel younger,,

my 1st computer access was pre-University, via Hollerith, but I had to take a long break for a few other avenues, and colleges would not include computers for Engineering wannabes. They thought it was either for business or math.

So I never had to issue a "RaiseSprocket" command, but to keep things going I did have to maintain the printers' paper tape and make up workarounds, such as scotch-taping something to blot the lights that get dust-covered, which made the computer think it was running out of paper. Also did the paper-clip fix; I forget which pieces - one I think was the paper tape piece.

While I got to use some real teletypes, which were aging, they made their programs on paper tape, so no direct access there except selected operator commands, but the 32 Kw mini did let me code through the switches on the front panel. Fun. The top two popular instructions I recall were Fubar, which meant "jump to here", whose ~alarm was the computer lights not flickering any more; they just went steady, mostly blackout (zeroes). Used that for debugging code prior to writing improved error handlers. The other was a "jump all the way to the next instruction", a.k.a. NOP, no-operation. Very handy when placed appropriately, so one could easily insert a "patch" where applicable at a later time. Memory was core. Meaning, to perform a 'dump' operation, we could pull the RAM from the production box, insert another fresh board, be back 'up' in like no time, and... place the RAM in a separate offline machine for troubleshooting, looking at the actual content at the time of some failure. Those days are gone. Defaults now include code that sends Internet packets to vendors upon detecting things like errors, keeping users actively doing rebuilds for fixes while troubleshooting.

As FYI, modern federal agencies keep $pending on upgrades, but as they are more politically oriented than technical, they continue to use the teletype forms for messaging, upgrade after upgrade. As much as things change, much remains the same. Where managers forced to use computers had pushed "CAPS LOCK" on as soon as it was available (speaks much of their confidence for propriety), they now run with handheld BlackBerrys, having a complete (tiny) keyboard, which permits going through hundreds of small eMails quickly, using only their thumb. Brag rights. But they expect programs to be written by a single mouse click, and citing economy and budget, want no more than one programming language and only a single vendor to have to deal with (M$). Use FUD or ArthurAnderson_accounting to validate their decisions.

First computer game: Star Trek (grid). First enjoyable computer: Pr1me (mini). Their (now) footnote in history was that all S/W and H/W was backward compatible (yes, not only upward, but... for real!). They were technical folk, perhaps to a fault (fewer sales than competitors; maybe it's harder to sell terms like "backward"). Ours was open, more for development than business, but still so critical that the favorite game of "Adventure" (explore cave) was limited to only so many moves before getting timed out. But it was real-time, unlike our Univac workhorse, which needed punched cards. While waiting for output, we'd sit on the 'popular' Pr1me. If there was a terminal available. That encouraged some of us to revise schedules to become night-people. (you are inside the cave. it is dark. watch your step. beware qwerty). I believe that more of the Pr1me popularity was due to it also running freewares than to the fun of it being interactive. It had a shell capability to be a Unix (free from AT&T - they were not a S/W company and coveted non-support back then) and as Adventure was also open_source, though Fortran IV, from time to time it was added to - more paths, caves, almost dynamic capability, shared globally. One initial mod was probably the 1st 'cheat code' for games, to help those with the constraint of time_out continue to explore deeper into the cave. As I recall, the xyzzy cheat was even included in the first mass distribution. Price of disk had nothing to do with development of cheat codes; that ran on an independent thread.

Whatever we said above, nothing much changes over time, for this Q (is there one?):

> How has the reduction in the cost of storage space impacted software development?

Not at all. It is the size increase that has the impact, resulting in a collection of paddings of bad code, useless functions, pasted-in pieces of other programs that may or may not work, programmer personalizations and personal game development, dead code, extra cheats and vulnerabilities, embedded chats and back doors - commonly referred to as "bloat", even "bloatware".

Unlike the term "bug" (old usage), I do not recall the term "bloat" being used until more recently, and claim that for development it was due more to the size issue, oriented more toward business use (e.g. pictures for data cells of spreadsheets) than personal.

We are now rich people. Due to the economy of disk. While we had floppies that were larger, the ten-er began gaining popularity, but my intro was to fix on the eight-incher. What to do when they go bad? - that used to be an economics question. For example, those with plants and not a great green thumb could place one around the stem, to help retain moisture, to free up some more time. In some places they were a cheap (free) frisbee alternative. Much better at that (extra built-in economy) than the three-inchers or even the fives-ies. Where we had had to push "eject" to get completed printouts, some extra blank pages between the printouts were quite useful for scratch paper. We saved them up. Who'd go after scratch paper like that with zeal any more? Yes, my friend, this advancing technology is making us richer. But it is more a social thing than an impact on development. While I no longer have those wide pages for flowcharting, when developing I no longer have that urgent need for such larger flowcharts (I once used them to disassemble/reengineer an OS - but they were smaller then).

The disk-space cost issue was more personal than business, and that permitted the developer some freedom of movement, increasing capabilities at a greater distance from work/office. This is more in the category of social influence. Social engineering. Many can now simply stop coding, change a diaper, and pick up where one left off in the process without having concern for time-outs, secured access, or physically moving a great distance.

The freedom of the internet! Keep it free! Add in another cavern if it suits you!
hi sw102

Not a direct answer to the question (whatever it was) but
since we seem to be going back in time here...

I went to work at the Boeing Airplane Company in Seattle in 1955. After several years in PRS I got involved in analog computers.

We had the BEAC (Boeing Electronic Analog Computer), which was a whole room full of racks of operational amplifiers connected directly by wires of different colors with banana plugs on the end. BEAC op amps worked on +/- 50 volts, which made the math for programming difficult.

Then Boeing got EASE and PACE analog computers, which had op amps at +/- 100 volts, which made programming and scale factors much more user friendly. Also the wires were on a board about 3x3 feet which could be removed and replaced by another board... so each programmer could work with a personal board.

Analog computers are super good for integration, which is at the heart of simulating physical systems such as airplanes. Addition and subtraction of voltages were also easy, but multiplication and division were very difficult... Some very clever multiplication techniques were developed, like servo multipliers, which would move the tap on a resistor... and even more clever, the quarter-square multiplier, which could multiply two voltages and had no moving parts.
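(For reference, the quarter-square multiplier rests on the identity xy = ((x+y)^2 - (x-y)^2)/4: two squaring networks, a subtractor, and a fixed division by four yield the product with no moving parts.)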

whooee....guess I should write some history stuff some day...

leo



Hello guys,
thank you very much! Your stories encourage me. I feel I am very old, extremely old. But now I am happy because I know that the old guys are working very successfully now and that IT is not only for the new generation of programmers.

Again - I enjoy this talk here and wish you all (old guys and young guys) all the best! Be happy and successful.

The points have no value. So please divide them between everybody on this question, or refund them to the asker.
leojl has just taken ten years off me!

Oh yes, the Minsky-Mouse! I had a colleague who went to Russia in the early seventies with an ICL 1906A machine and was very impressed with the Mouse. Everybody programmed in Assembler in those days, although there was a Russian Algol compiler which optimized in 21 phases or such. Since the instructions were in English and the comments in Russian, two ICL 1933 barrel printers (lovely noisy things) were used, and then scissors and sellotape. My friend had married a Finnish girl and Russian was their common language, so he was chosen to go (lucky man).

And now to Boeing. That used to be some company before the bean counters moved in. Only technical excellence was important, which is why Boeing had led the market. But times do change, don't they.

I used to know the Hollerith codes. I'll have to learn them again.

I like the idea of refunding the points! It reminds me a bit of Oscar Wilde: "Don't talk disparagingly about society, Algy; only people who can't get into it do that"
No compelling reason to disagree with the recommendation, and while modesty is definitely becoming in an Expert (after dealing with Cd& for so many years), I figure nico5038 knows what he's doing.

Per recommendation, force-accepted.

Netminder
EE Admin