OS SCI-FI: AN ESSAY
Edward E. Rochon
Edward E. Rochon on Shakespir
OS Sci-Fi: An Essay
Copyright © 2017 by Edward E. Rochon
Thank you for downloading this eBook. This book may not be reproduced, copied and distributed for non-commercial purposes, unless prior permission is given by the author.
Your support and respect for the property of this author are appreciated.
Some Other Works by the Author
Axioms & Theorems: An Essay
City of Light: An Essay
Cubics: A Numbers Essay
EMF Banding: An Essay
Global Warming: An Essay
God & Square Roots
God & Square Roots II
Hardwired Drunk: An Essay
Holographic TV: An Essay
Inexpensive Subs: An Essay
Light Capacitor: An Essay
Logic: An Essay
Number Bases & Digits: An Essay
Pest Control: An Essay
Plan RD: An Essay
SDI: An Essay
Seven Month Pregnancy: An Essay
Space as Infinity: An Essay
Super Intelligence: An Essay
The JU Engine
Thinking: An Essay
Tolerating High G: An Essay
Table of Contents
WWII winds of war, firestorms, un-gay, wretched Alan Turing, code breakers, proto-hackers and sci-fi computers in the Fabulous Fifties! Spaghetti code, precious RAM, machine language and vistas of glory! Haven't we come a long way? We've got module programming, viruses up the ass, Hepatitis C, C+, C++, and Dissembler Language but a distant memory. Is it possible we have overshot the mark and need to go back a bit to our roots?
After years of torment going back to my college days on the IBM 1130, through the IBM 360 to PC’s up the kazoo, FORTRAN, COBOL, BASIC, C and MCSE certificates, I am thinking things over and suggesting we need to reevaluate the current situation. Identity thieves of the world, spy agencies, malicious hackers of the world, be gone from the private user. This is the substance of the essay, with some basic ideas laid out mingled with a bit of camouflaged reminiscence. All ideas mentioned are already in use to some extent, and follow patterns set in programming's past, here applied to operating systems more systematically than is currently done.
Chapter 1: Spaghetti or Module
Professor Lonesome George Gauguin sat at his office desk writing programs. We interrupted him for some help with our programming project and talked a bit about the old days:
WE: What is your favorite programming language?
GEORGE: Machine Language.
WE: Oh, back in the days with Grace Hopper! Is that not hard to write? What about COBOL, this language you teach us?
GEORGE: Yeah, right. I like the good old days before all that.
Lonesome George would encourage us to write module programming even in Assembler Language. Yet I felt that his heart was never in it. Nothing like a good plate of spaghetti to confuse the meatballs, and the blood of the tomato head to wash it all down. George pined for simpler days, I think, just as Paul Gauguin slipped away to Tahiti to live a simpler life than could be had in la Grande Nation Metropolitaine. Now, simpler in one sense but harder on the uninitiated in another. Getting to the core of nature simplifies, but the details of the core can complexify. COBOL's words simplify, but they complexify the unseen subtleties by reason of being unseen, by reason of programmers who do not know their architecture and the underlying fundamental principles of computing. There you go; you say tomato, I say toma(h)to.
Now, we are lovers of lucidity, simplicity, brevity and order, the very stuff, you would suppose, of module programming. But we may have overshot the mark.
Then people wanted more user-friendly computers, meaning GUI (Graphical User Interface). We needed bigger and bigger operating systems. People did not want monopoly in computing. Microsoft was forced to accommodate many applications from other companies, requiring opening up code details to an alarming degree. MS wants to hide its code from copycats, while stealing whatever code it can get away with from other folks. “You Steal, I Steal, Sue You, Sue Me” is a song sung by the great Commodores of Industry, the Lions richly rewarded, the God-Lions (Lion-El in Hebrew, or at least Heblish?). Well, those folks are itchy for riches, maybe Ri(t)chies in Jive? So like Lion-El Ri(t)chie(s).
Operating systems were always broken into files (modules of a sort). Lines of redundant code are cut out and placed into a routine or smaller sub-routine. This requires header and closer lines, but the routine is made bulletproof, has standard input/output, is put away in a library, and is used over and over. When many lines of inline code are replaced, you save memory. It also requires calls that add time to the runtime of the operation. In a programming language, the keywords or basic commands are little routines. They are small, run quickly, and are built into the language.
Go To and Return keywords move within the program to access lines of code. Incorporated into the language, they run quickly. A long sequence of code could access a certain few lines repeatedly with many Go To lines in conditional statements (If, Else, etc.). These branching statements also add lines to the program, but with less overhead than routines. They can be very confusing, and the lines accessed have no routine names. There is also the problem of returning to the branching statement to continue the process at the point branched from. Routines have embedded coding that returns control to the branching statement.
Keywords, like any words, can to some degree be combined into phrases. For example, you could have a phrase such as: <<Go To xxxx; Return xx. The two angle brackets (<<) tell the computer a phrase follows. We combine two keywords. The first (x) run denotes a line in the program. The next (x) run tells the computer to return after x number of lines. So if you jumped to line 7,999 and ran 6 lines, you would return to the branching line after performing line 8,005 (7,999 + 6). We could also have Go To (x, y), where (x) is the line number and (y) the line run count. There are many computer languages, and constructs like these are likely in force somewhere. The phrase would be standard and run with fewer lines, moving more quickly, but it would be spaghetti programming and confusing.
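As a rough sketch, the jump-with-run-count phrase can be modeled as a tiny interpreter. The opcode names, the halt line, and the small line numbers are my own illustration, not any shipping language:

```python
# Minimal sketch of a "<<Go To x; Return y" phrase: jump to line x,
# execute y lines, then resume at the line after the branching statement.
def run(program):
    output = []
    ip = 0                              # instruction pointer (0-based line index)
    while ip < len(program):
        op = program[ip]
        if op[0] == 'print':
            output.append(op[1])
            ip += 1
        elif op[0] == 'goto_return':    # the phrase: (target line, run count)
            _, target, count = op
            for line in program[target:target + count]:
                if line[0] == 'print':
                    output.append(line[1])
            ip += 1                     # return to the line after the branch
        elif op[0] == 'halt':
            break
    return output

prog = [
    ('print', 'A'),           # line 0
    ('goto_return', 4, 2),    # line 1: jump to line 4, run 2 lines, come back
    ('print', 'B'),           # line 2
    ('halt',),                # line 3: stop before falling into the shared lines
    ('print', 'sub1'),        # line 4: shared lines with no routine name
    ('print', 'sub2'),        # line 5
]
print(run(prog))              # prints ['A', 'sub1', 'sub2', 'B']
```

Note how the shared lines at 4 and 5 have no routine name; the interpreter, like the programmer, only knows them by number, which is exactly what makes this style confusing.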
When you program like this big time, the text editor would have (and does now) the function of updating all branching commands. So, for example, should 10 lines be added above line 7,999, all references to that line would be updated to 8,009 (7,999 + 10).
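That renumbering pass can be sketched in a few lines; the tuple-based program format is an assumption for illustration, not any real editor's internals:

```python
# Sketch of a text editor's renumbering pass: when n_added lines are
# inserted above line insert_at, every branch target at or beyond that
# point is bumped; targets below the insertion stay put.
def renumber(program, insert_at, n_added):
    updated = []
    for op in program:
        if op[0] == 'goto' and op[1] >= insert_at:
            updated.append(('goto', op[1] + n_added))
        else:
            updated.append(op)
    return updated

prog = [('goto', 7999), ('goto', 100)]
print(renumber(prog, insert_at=7999, n_added=10))
# prints [('goto', 8009), ('goto', 100)]
```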
Now, what are we getting at here? Operating systems are getting bigger and bigger. Though more and faster RAM is available, size increases computing time, error potential, and memory corruption potential. Putting redundant program lines into routines saves lines and makes programs easier to read and understand. It also makes it easier for hackers to figure out the lines. This might prompt the application owner to jumble his code, in effect creating modularized spaghetti coding, unintelligible unless you have the jumble algorithm to put it back into the original programming block form.
I remember my impression of Lonesome George. I thought of the human brain. A vision came to me. The brain is broken down into modules (left side, right side, midbrain, cortex, etc.). But the bulk of the programming is done in spaghetti coding fashion. The neurons of the brain branch out into many dendrites, joining the central core (the axon) with other neurons. Synapses allow jumping in many directions from one axon through numerous dendrites. These are spaghetti-like, and the coding is spaghetti coding, very complex. So at the micro level the brain uses spaghetti programming, but at the macro level it does tend to break things into modules. The brain architecture determines the type of programming; that is to say, it must be spaghetti programming to fit the architecture. This allows the parallel analysis and synthesis of ideas and routines performed by humans. I might add that functions can be shifted to other modules in the brain to run with and supplement the other area(s).
The routines used in programs do have some spaghetti programming in them, but to avoid confusion these can be modularized into subroutines, saving redundant coding and avoiding confusing branching. The thing is that complex interactions work better with spaghetti code. Many things can be brought together in a more subtle and ultimately more compact fashion with spaghetti coding. In the long run, it runs faster and with fewer lines of code for each particular grouping of operations.
The brain is the world’s most complex computing machine. It does more than number crunching. Its structure is the work of God, and God must know what he is doing. So we would be well advised to conform to his program. In effect, we continue to modularize at the macro level but should shy away at the micro level. What about the confusion and maintenance, especially when new programmers replace the old? This is dealt with in the next chapter.
Chapter 2: Super Text Editors
When programmers program, they enter notes in the program to explain what they are doing, as a reminder to themselves and current co-workers, and to others who may work on the program at a later date. These lines are not compiled with the code, being unnecessary, since they would add to the size of the application and give away info to hackers. In a debugger, you will see only a corporate name, copyright symbol and date, with a version number added. Everything else is, for the most part, a jumble of alphanumeric symbols or blank spaces.
How far do we go in writing spaghetti code? We go as far as humanly possible. In the first place, in spite of its complexity, when well written it is superior at the micro level to module code. In the second place, it is much harder to hack. In the third place, it cuts down on job milking and the deliberate obfuscation of things to make firing staff more difficult. The easier a job is to learn, the easier it is to fire and replace employees. The quicker things are learned, the quicker you are likely to be fired in a market economy, or when your boss does not like you in a command economy. How do employees deal with this? They figure out ways to job milk so as to prolong the job. They deliberately strive to make jobs complex so as to make their replacement more difficult and to justify higher salaries. What is complex requires more brains and experience, so pay me more.
With complex spaghetti code, this behavior will not be necessary. But how to keep track of things?
OK, with advances in computer applications, we have advances in text editors. In the old days, the programmer would write source code (machine, assembler or higher language) on paper, punch it into cards for card readers, place it on tape, etc. Later, word processors allowed storing data in files for reading on a monitor. At first, editors were very simple, such as Edlin in old DOS. Now we have audio, keyboard and touchscreen input into editors. We have comment icons in apps that allow the comments to be invisible when the basic code is listed. This is how the future of corporate program writing will take place, and it is taking place to a great extent right now. We simply go hyper with the enlarged spaghetti code file.
A programmer is given an assignment at the micro-programming level. He writes spaghetti code. He works like a forensic medical examiner. He speaks while he writes. All lines typed in are date/time stamped. His audio recording is date/time stamped. His complex editor can collate audio, program lines and comments quite easily. Our programmer works writing code for a few hours. Then he stops to read a screen printout of all comments synced to all lines written. Audio can misinterpret words when converting them to text. Glib comments might not be clear enough. The programmer goes back over all the work and reads all comments. He corrects any misinterpretations by the computer. He rereads his comments for clarity, even for himself. Often, when you read your own notes in textbooks, you cannot understand them. The context of memory that induced the glib remarks is gone. You do not remember these missing contextual memories, so your note is meaningless twenty years later, or even twenty weeks later in some cases, assuming you go on to another task.
He must make sure that all notes are clear to himself no matter how long the lapse of time. He is encouraged and trained to write lucid manuals for others. So the notes on programs are very extensive, well edited, and likely reviewed by supervisors or collaborators writing other parts of the program.
Ideally, the program should be written in machine language. Modern text editors allow better comprehension and input through cut, copy and paste, and through libraries of redundant code placed inline rather than as routines. Assembler or Assembly Language is very close to the architecture of the computer being written for, but it can still be deceptive. If you do not think so, why did computer dumps, past and present, print out in hexadecimal code rather than assembler text? All debuggers show binary, base 8 or hexadecimal code. That is why I call it dissembler code. It is not nearly as prone as the higher languages to unintended errors, even the C languages, but machine language is more exact. Strictly speaking, 1’s and 0’s are also source code. Men do not read electrical states in transistor circuits, but binary is closer coding to the machine than any other coding.
As I have written elsewhere, we should not only have binary and hexadecimal, we should have bases of 256 and 256 squared as well. I have shown that logical construction of numbers can make this happen. By zooming in and out, a large array of 1’s and 0’s compresses into a few higher-base digits. This allows the programmer to more easily describe certain runs of coding. On top of that, modern text editors can color code, or change the font and font size of, runs of numbers to make things stand out according to the needs of the programmer. Computers are excellent at searching out runs of coding in this manner.
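The zooming idea can be sketched as follows, grouping a run of bits into digits of a higher base. The function name and the bit-string format are my own illustration:

```python
def to_base(bits, base):
    """Compress a run of bits (a '0'/'1' string) into digits of a
    higher base. base must be a power of two (256, 65536, ...)."""
    width = base.bit_length() - 1        # bits per digit: 8 for base 256
    pad = (-len(bits)) % width           # left-pad so the run divides evenly
    bits = '0' * pad + bits
    return [int(bits[i:i + width], 2) for i in range(0, len(bits), width)]

run_of_bits = '0100100001101001'         # a 16-bit run
print(to_base(run_of_bits, 256))         # two base-256 digits: [72, 105]
print(to_base(run_of_bits, 65536))       # zoom out once more: [18537]
```

Zooming out from 16 binary digits to two base-256 digits, or one base-65536 digit, is exactly the compression of notation the paragraph describes.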
As for using higher languages to write program compilations for different computer chip platforms, forget about it. Bad idea! Use a conversion text editor to load in all source platform info on one side and the target platform on the other side. You need a programmer (or programmers) who understands the platform. Only he is qualified for the task. You may think the higher code was converted correctly, but many hidden bugs get embedded that will be difficult even to detect, let alone fix. Big jobs are broken down among teams of programmers. Office suites already have the software to allow joint effort. We must have programs working to exacting standards of accuracy with such massive programs running; otherwise it is bug city and hacker paradise.
What do we do with our detailed comments on the applications? We put them under lock and key, making it impossible for anyone to hack into them through the corporate computer system. The comments go to an offline computer, or an offline computer network. Files are stored on removable storage. Security guards protect them with observation, lock and key, and guns when required. Hackers must then be thieves, and are shot on sight. For important stuff, the guard and programmer have dual keys to unlock flash sticks or files, and even to re-lock file memory units. When necessary, a VP for the project comes down to use a triple key. Otherwise, the VP receives a log of all withdrawals and returns, comparing programmer logs against guard logs.
These huge spaghetti files are extremely difficult for hackers to comprehend without the supporting comments. The programmers have put in tripwires that sound alarms when an unauthorized write is initiated without going through the proper sequence. Of course, hackers love these challenges and will surely try. Why make it easy for them?
As for job milking and obfuscation: they will not be necessary, as the following conversation illustrates:
(Bill Gates drops by the old office to check up on the latest source code for Word and underlying OS.)
GATES: Hi boys.
BOYS: Hello, Mr. Gates.
GATES: Just call me Bill, guys. So give me a summary on the latest rewriting into spaghetti code for Word.
BOY1: Sure, Bill. Let’s see, today is Monday; if you can hang around until Friday evening, we should be able to give you a brief summary. For a more detailed description, let’s see, Columbus Day, taking into account the Thanksgiving long holiday, we should be able to get you out of here and home for Christmas.
GATES: OK, why don’t you just give me a copy on disk or flash stick, I’ll read it at my leisure at home between now and Christmas.
BOY2: I’m sorry, Bill, Mr. Gates, but we have been given strict orders not to allow that. It is too risky.
GATES: You know, boys, I still own an awful lot of Microsoft stock. Do you really want to deny me access to info on the company that I started?
BOY3: No, we do not, Bill, and we are very sorry, but if we give you the program, we are all immediately terminated. Maybe you could get us rehired, and maybe not. You realize that the chain of command must not be tampered with? As a former CEO, you should understand that. So if we are fired, we might as well be fired for doing our duty in the chain of command, a chain that you are no longer a part of.
GATES: Alright, boys, I see your predicament. I’m going to head upstairs and lean on some VP’s and P’s and CEO’s and get this straightened out. So long.
BOYS ALL: Have a Merry Christmas and a Happy New Year, Bill. And Thanksgiving too.
GATES: Same to you, boys. Adios!
Now I would like to talk about enhanced security in the next chapter.
Chapter 3: Super Security
I have several recommendations to deal with the new paradigm and current security problems brought up here:
- Just as the problem of integrating functions into a unified program package was solved by Office Suites and related application packages, we must consider distributed OS to deal with the ever greater need for complexity and reliability. I have mentioned this in other works.
An OS is specifically written for each complex application and suite. MS Office covers a lot of territory; it justifies the loading of its own OS, specifically written for it. By this means, no one can claim monopoly, as all apps run with their own OS. More than that, if Word is brought up, a specific OS is loaded with the app. Screw the wait time; the complexity of systems already creates wait time. Add to this the need to run anti-virus and OS monitoring functions, coupled with a chronic inability to keep up with malware and the constant integration of fixes, and it is worth the wait, knowing that it secures your computer’s integrity.
Now, if you bring up Word, Excel and the browser together, the OS juggles an interface across the platforms. It may shut down or delete duplicate functions within the apps when they are handled by the interface.
Ideally, all computers should have a compact operating system embedded in BIOS that can access files, operate input/output, run a calculator with a running monitor tape, run a basic text browser, a basic word processor, and a simple systems analyzer (Device Manager, etc.). That way, if the computer crashes, you immediately switch to the virus-proof, crash-proof OS to monitor your computer.
The specific OS will allow apps operating under different OS's to run in parallel in virtual memory, or, by a mechanical switching device not accessible online, to switch between OS systems. You can have Windows and Linux up at the same time or in quick sequence.
- The Master Boot Record records the usual information. It also records the last source of the file, and may include prior file sources in an extended optional mode, along with a date/time stamp. This would also include URL sources for any downloads, and the connection source, landline or wireless. This means that if you have a flash stick and you write to your hard drive, the name you gave the flash stick, or a unique ID on the flash stick, will be recorded in the Master Boot Record. This has nothing to do with any information in the Properties of the file, but is OS/BIOS driven. Even if you copied something with a virus from your work computer, or a friend’s computer, the file could be tracked down to the input device, and optionally the prior input device of your friend’s computer, and so on back a certain number of iterations. Anti-virus companies often have a hard time tracking down a source. This source information is quite useful, especially for URL’s online. It gives a definite clue as to the source. All crashes should be detected by BIOS and date/time stamped. Trackers can link the date of a crash to the date/time stamp of a file and its source. It is not permissible to write to disk vicariously: a URL source does not get laundered through Word. Word tracks any internal RAM writes until the write from the Internet is erased. This should be standard on all apps by default.
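A minimal sketch of such a provenance entry follows. The field names and source strings are illustrative, not any real Master Boot Record layout:

```python
import datetime

def provenance_record(filename, source, prior_sources=(), max_depth=3):
    """One provenance entry as the text describes: the last source of a
    file plus a bounded chain of prior sources, date/time stamped.
    All field names here are my own illustration."""
    return {
        'file': filename,
        'written': datetime.datetime.now(datetime.timezone.utc).isoformat(),
        'source': source,                         # flash stick ID, URL, etc.
        'prior_sources': list(prior_sources)[:max_depth],   # bounded iterations
    }

rec = provenance_record(
    'report.doc',
    source='flashstick:KINGSTON-8G-a41f',
    prior_sources=['pc:friends-laptop', 'url:http://example.com/download'])
print(rec['source'], rec['prior_sources'])
```

A tracker could then match a crash's date/time stamp against the `written` field and walk the `prior_sources` chain back toward the infection's origin.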
- We must confuse hackers with individualization of systems. Microsoft and the Linux vendors test their packages, then insert tripwires into them based upon a random 256 code keyed into BIOS. These tripwires should not significantly impair performance, and are transparent to the package. What happens is that, in addition to Windows performing online validation of companies and requesting user permission to install packages, Windows places tripwires into its OS after shipping, randomly, keyed by user input. There are certain ways to write, and places to write, applications. Microsoft already reserves areas (or did, last I knew) that cannot be written to unless the administrator goes through a process to remove the restriction.
There are certain things the various apps should not write in the registry, places they have no business writing to in general. No hacker can know precisely where these tripwires are. When one is tripped, the installation stops and a message goes out to the user. It might say that suspicious writes, inconsistent with the package type, have been noted and stopped. The app will state what it is, and this will be confirmed by the user. If the app does not have an app type encoded, the user must divine the type and give it to the OS. If the app tries to write to a very sensitive area, Microsoft may flat out refuse to install it, since its prohibition against modifying its software is being violated. If this is some ploy by Microsoft to suppress competition, that must be dealt with legally. Microsoft will have a log that remains with the computer. If the user deletes or tries to modify the log, he loses his case automatically and is liable for damages.
Not even Microsoft knows where these tripwires are. For maintenance, Microsoft contacts the user by phone and instructs the user to remove his unique identifiers with a code that comes both online and over the phone for security. Then the user types in another random 256 key input to re-individualize his computer. The more tripwires that are put in without significantly compromising performance, the better. The OS can of course keep track of these tripwires, though they are randomly inserted in a manner determined by the key.
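The key property here, that tripwire locations are regenerable from the user's key but unpredictable to anyone without it, can be sketched as follows. A real design would derive the locations with a keyed cryptographic hash rather than Python's `random` module, and all names and sizes here are illustrative:

```python
import random

def place_tripwires(key, n_slots=1000, n_wires=20):
    """Derive tripwire locations deterministically from the user's key.
    The same key always regenerates the same set, so the OS can check
    writes against it, while an outsider without the key cannot
    predict where the wires lie. (Sketch only: use a keyed hash, not
    random.Random, in any real system.)"""
    rng = random.Random(key)
    return sorted(rng.sample(range(n_slots), n_wires))

user_key = 0xC0FFEE                          # stands in for the user's random key
wires = place_tripwires(user_key)
print(len(wires), 'tripwires placed')
assert wires == place_tripwires(user_key)    # reproducible from the key alone
```

Re-individualizing the computer is then just typing in a fresh key; the old wire layout vanishes and a new one is laid down.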
- For a small fee (the phone will seldom be used), Internet providers provide a unique dedicated phone link that operates alongside the landline (perhaps also with wireless in some cases). The phone runs through BIOS. BIOS has a small display screen similar to a phone’s. Microsoft or an anti-virus company can connect via both links to the computer when the user switches on a mechanical on/off button. The user can see certain coded info that ID’s Microsoft or the security provider. These lines are limited access. You must be licensed and vetted to use them. Of course, any user can hook up and observe the integration between online and phone line validation, but cannot call out himself. This speeds up the validation and monitoring process. It makes it much tougher for hackers.
- Early programmers had to deal with memory, both RAM and to a certain extent storage, that was expensive and limited compared to today. The industry set to work expanding both. Operating frequencies zoomed. The more spikes in the frequencies, the more data is transferred per unit of time. Miniaturization of electronic components started. Shorter transit time between elements saved time but increased heat dissipation problems. Small elements were more sensitive to static charges and heat burnout. Increasing frequency and dense arrays of elements became a problem. Using co-processors started early, to take pressure off the CPU chip. This is a type of paradigm for just-in-time processing. To keep frequency and heat dissipation in check, several chips were yoked to one basic CPU processing core. So this portends another management stratagem.
Early programmers and programming followed a sequential approach to solving problems. The parallel processing that evidently goes on in the brain was non-existent or minimal. To break the logjam of heat dissipation and overly sensitive elements, we must spread processing out over space. This reduces power to the fans, with little increase in power to span the increased signal distances. If you have 10 layers of circuits measuring 1 by 10 by 10 stacked upon each other, you have a 10 by 10 by 10 cube. All layers dissipate heat. Lay them out side by side instead, and you have a 1 by 10 by 100 flat extended surface, much easier to cool. The problem? It increases transit time between elements. This is where just-in-time processing helps. Many tasks are complex, such as building a house. You have a sequence. The foundation must be built before the walls go up. The walls must be up to support the roof. The outer framework allows inner work on doors, and walls provide the framework for windows, wiring runs and so on.
But at a construction site, while the foundation is being laid and you have idle hands, walls could be built off to the side, even with doors in them, provided you have a plan to put them together with equal facility as when building the house one step at a time. So all complex problems are analyzed for a segmentation scheme. If an element that can be isolated cannot be used until the main program function reaches a certain point, it does not matter that it takes transit time to branch back into the main program function. For a math co-processor, the data is fed via a bus, calculations are made in a separate area, then returned to the main function only when needed. This allows spreading out chips, reducing power for cooling, and allowing cheaper, lower frequency chips. The trick is to build a pre-programming segmentation software package that can do this. This package must receive input from the personal computer and from the input to be processed, and have its own preset algorithms that may change from program to program.
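The construction-site segmentation can be sketched with a thread pool: independent work runs off to the side while the main sequence proceeds, and results are joined only at the point they are needed. The task names and sequencing are my own illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def build_house():
    """Sketch of the segmentation scheme: walls are prefabricated off
    to the side while the foundation is laid, then joined in the
    required sequence."""
    order = []
    with ThreadPoolExecutor(max_workers=2) as pool:
        foundation = pool.submit(lambda: 'foundation laid')
        walls = pool.submit(lambda: 'walls prefabbed')   # independent side work
        order.append(foundation.result())   # sequence point: must finish first
        order.append(walls.result())        # already built in parallel, just join
        order.append('roof on')             # depends on both being done
    return order

print(build_house())    # prints ['foundation laid', 'walls prefabbed', 'roof on']
```

The `result()` calls are the "branch back into the main program function": the transit time they cost is hidden because the side work was already done.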
We would expect this to complicate hackers’ attempts to intrude into the individual computer. They would need to gather much data to mimic this function, and this function would determine whether their virus ran properly or at all. I suspect such a pre-programming package would have innate anti-virus potential. The very breakdown into components might give away hidden coding, the deceptive coding infecting a software package.
The Japanese made a name for themselves with just-in-time inventory control, keeping downtime to a minimum and warehousing space and cost to a minimum, giving them expense reduction advantages in manufacturing. The car plant would have just the number of bumpers it needed to keep the plant running for a short prescribed period of time. No great store of bumpers was kept at the factory. Tight integration between different divisions of the corporation, and with sub-contractors, allowed this. This was the follow-on step to Henry Ford’s mass production line; Ford made sure that work came to the worker at his work station, rather than the worker moving about.
- Rapid code conversion between corporate branches helps secure data. A branch hands out complex but easy-to-use codes on flash sticks to workers. These encode/decode work between local branches and in-house work between computers. If a worker sends info outside his branch, a separate code links the branches to corporate headquarters. His info goes to the regional office, is decoded and re-encoded there, and is sent by corporate code to headquarters. This is also possible between functions of a PC, where the user thinks the sensitivity of the data deserves it.
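The branch-to-regional-to-headquarters hop can be sketched with a toy symmetric cipher. The XOR-against-a-hash-keystream scheme below stands in for whatever real cipher the branches would actually use, and the key names are illustrative:

```python
from hashlib import sha256

def recode(data, key):
    """Toy symmetric cipher: XOR against a keystream derived from the
    key by hashing. Applying it twice with the same key decodes.
    Illustration only; not a real cryptosystem."""
    stream = b''
    counter = 0
    while len(stream) < len(data):
        stream += sha256(key + counter.to_bytes(4, 'big')).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

branch_key = b'branch-17 local key'        # handed out on flash sticks
corporate_key = b'regional-to-HQ key'      # links branches to headquarters

msg = b'quarterly figures'
in_branch = recode(msg, branch_key)            # encoded at the local branch
at_regional = recode(in_branch, branch_key)    # decoded at the regional office
to_hq = recode(at_regional, corporate_key)     # re-encoded under the corporate code
assert recode(to_hq, corporate_key) == msg     # headquarters recovers the message
```

The point of the hop is that a worker's flash-stick code never travels past the regional office; only the separate corporate code crosses to headquarters.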
Analog breaks help keep out viruses. Data is displayed on a monitor. Two monitors are linked by camera in a closed box. The fonts and symbols are uniformly consistent, leaving bar code scanning and text scanning highly reliable. Any typically non-ASCII or international code would be given a symbol, even if only in a hexadecimal grouping. This is an analog/digital break. The screen is read and re-digitized for processing. Hidden viruses cannot get through via the information. Even executable programs would have a tough time hiding viruses, provided a quick analysis of the source programming went through debuggers looking for suspicious anomalies, with human eyes following the virus scan on a computer printout connected to one of the nifty debuggers described above.
More and more stuff should be placed in BIOS, out of reach of hackers. More hardwired switches that mechanically interrupt access between modules in the PC should be installed. Mechanical switching between OS types kept on separate chips in the PC, whether in virtual memory within a main package or running parallel with each other, should be in the future of PC’s. We cannot progress to the city of light without making serious inroads into the hacker’s power to bring the digital highway down. As mentioned in the preface, everything mentioned here is being done in some form right now. It is simply a matter of systematizing it and doing it to a greater extent. The symmetry between application packages and OS suites seems obvious to me, if not to the OS-producing companies. Yes, current OS’s give options to put things in or leave them out of the system, but I have something in mind more systematic and all-encompassing than that.
Other Works by the Author
Collected Poems I
Collected Poems II
Elements of Physics: Matter
Elements of Physics: Space
Elements of Physics: Time
Unified Field Theory: An Essay
Space as Infinity II
Golden Age Essays
Golden Age Essays II
Golden Age Essays III
Golden Age Essays IV
Golden Age Essays V
My current biography and contact links are posted at . My writings include essays, poetry and dramatic work. Though I write poetry, my main interest is essays about the panoply of human experience and knowledge. This includes philosophy, science and the liberal arts. Comments, reviews and critiques of my work are welcome. Thank you for reading my book.
A preface suggests we overshot the mark in cutting back on spaghetti programming. Chapter 1 explains why spaghetti programming is better at the micro level for security and compactness of code. Chapter 2 pushes extremely precise note taking using modern text editors. These notes are the key to large spaghetti code, locked away in safes, never online. Spaghetti code makes it tough for hackers, not easier. I offer some suggestions to take full advantage of text editors, recommend dumping high level languages for programming (including C), and suggest going back to machine language. Chapter 3 proposes some ideas that include OS suites, attaching specific OS programs to specific apps; Master Boot Records that trace the source of all downloaded files, including URL’s and removable media; and unique OS packages via user-inserted random 256 keys. Each Windows becomes unique, making it hard for hackers to find the tripwires that foil their dirty deeds. Some other security fixes are mentioned.