Technological Change (in my life) part-2
edit: 2021-03-18 (this is a stream-of-consciousness thing)
Epiphany 10: Smart Phones are really Star Trek Tricorders
- The title of this section says it all; today's Smart Phones are the equivalent of Star Trek's Tricorders merged with Star Trek's Communicators
- We don't use them in the same way as Mister Spock because they appear to be making us less smart
- Why remember anything when you can look it up on the internet?
- Google Maps allows us to navigate without building a mental model of our neighborhoods, but we are now quickly lost without our phones
- I'm certain Mister Spock was never "playing Pokémon Go" or "checking his likes on
Facebook" so humans have a long way to go before we begin using these devices more responsibly
- And whoever thought it made sense to drive an automobile while texting on their phone? (more accidents today are reportedly caused by texting-while-driving than by drunk-driving)
Epiphany 11: "Moore's Law Revisited" and "the Big Data Revolution"
The following was partly inspired by the book Thank You for Being Late (2016) by Thomas Friedman
- Everyone has heard the apocryphal story about the king who was so grateful for the game of chess, that he offered the inventor any reward
- The inventor asked for payment in rice based upon the following progression over a 64-square board: one grain of rice for square one, two grains
of rice for square two, four grains of rice for square three, eight grains of rice for square four, and so on
- Realizing that the reward would break the bank, the king had the inventor killed (or so the story is told)
The Integrated Circuit (chip)
- The integrated circuit (chip) was first conceived and implemented by two different people working at competing companies:
- 1958: Jack Kilby at Texas Instruments
- 1959: Robert Noyce at Fairchild Semiconductor (Noyce went on to co-found Intel Corporation)
- Wikipedia mentions others who had the idea first, but TI and
Fairchild were the first to manufacture a product
- It took some time for chip production to become operational, and because some semiconductor components on a chip act as resistors, capacitors, and conductors (rather than transistors), gross transistor counts do not begin until 1960
- Fairchild first releases a commercial 2-transistor chip for sale in 1961
- In late 1964 Gordon Moore was approached by Electronics Magazine to write an article, which would be published in 1965 (the 35th anniversary edition of Electronics Magazine), speculating where the chip industry might be headed over the next 10 years
- Fairchild had just finished 1964 making 16-component chips and were planning to double the number in 1965
- Moore had noticed this doubling and so predicted (in the article) that doubling would continue each year for a decade
- This became known as Moore's Law
- the original 1965 article can be found online
- In 1975 he revised his formula so that doubling would continue every two years
- extrapolation of the chart (above-right on the original page) shows transistors doubling every year from 1960 to 1970, so let's assume Moore's revision (doubling every two years) begins in 1971
- In 2004 executives at Intel realized that continuing die-shrinks of the x86 P6 microarchitecture were requiring an exponential amount of human effort in order to gain only a linear increase in CPU throughput (part of this problem was due to the amount of heat). So they scrapped all on-going P6 development, replacing it with the Intel Core microarchitecture which would facilitate multiple cores rather than faster CPUs.
- In 2012 Intel announced the 64-bit Itanium 9500 which weighed in at 3.1 billion transistors
Big Data (explodes in 2007-2008 but no one noticed because of the financial crash of 2008-2009)
- In 2003 Google published a paper on the Google File System (GFS) which stimulated others to develop Big Data technology like Hadoop, allowing data center capacity to grow so large that you would never need to delete a file ever again (some people say this is the beginning of the Big Data era)
- According to Friedman, Moore's Law combined with Big Data stimulated a lot of change in 2007 but
these events were obscured by the financial meltdown of 2007-2008
- Activities (2005-2010) include:
- Amazon starts Amazon Web Services
- Twitter launches (March 2006; a 140-character micro-blogging site)
- Facebook expands from campus-wide to world-wide (starting in 2006)
- Apple releases the iPhone (announced January 2007; released June 2007)
- Google acquires Android (a stripped-down Linux to run the gPhone, given away to members of the Open Handset Alliance)
- Google acquires YouTube (2006)
- Google turns up Street View (2007; can anyone remember what life was like before this app?)
- LinkedIn reaches 10 million members
- GitHub launches (2008; repository of open source software)
- NASA co-creates OpenStack (2010; then puts it into the public domain; just check your Linux distro)
- It is the combination of raw computing power with big data that facilitated inventions like: IBM's Watson, Amazon's Alexa, Apple's Siri,
Microsoft's Cortana, not to mention self-driving cars
- Question: So what was so special about 2007?
Answer: Exponential growth (related to Moore's Law) allowed industry to almost fill the first half of the chessboard with transistors
- using the formula: count = 2^(square-1) we get:
  Note: the "one year" doubling interval was revised circa 1975
  - square 1: 1
  - square 2: 2
  - (time passes)
  - square 32 (last square of the first half of the chessboard): 2^31 = 2,147,483,648
  - square 33 (first square of the second half of the chessboard): 2^32 = 4,294,967,296
  - (time passes)
  - square 64 (the far future, provided Moore's Law can continue): 2^63 ≈ 9.2 x 10^18
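For readers who want to see the whole progression, here is a minimal Python sketch (my addition, not part of the original article) that prints the count for a few key squares using the formula above:
# minimal sketch (not from the original article): chessboard doubling
# count = 2^(square-1), printed for a few key squares
for square in (1, 2, 32, 33, 64):
    count = 2 ** (square - 1)
    half = "first half" if square <= 32 else "second half"
    print(f"square {square:2d} ({half:11s}): {count:,}")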
Epiphany 12: GPUs and Moore's Law
The original definition of Moore's Law stated that the number of transistors on a single chip would "double every 18 months" while "costing the consumer the same amount" (note: after the first decade Gordon Moore revised the doubling period to every 24 months). I remember reading articles in the mid-1990s claiming that Moore's Law would hit a limit in 2005 or 2006. The main reasons were attributed to manufacturing problems related to "the limits of photo lithography" and "gate sizes so small that electrons would be able to quantum tunnel across barriers, thus rendering them conductors rather than insulators".
Around 2004 Intel announced a change in direction away from "faster single CPUs" toward "single multi-core CPUs" for which they charged more money. Technically speaking, they avoided Moore's Limit by revising Moore's Law to drop the phrase "costing the consumer the same amount". So now we have an economic Moore's Law as well as a technical one.
Advances including FinFET, gate-all-around (GAA) technology, and 3D packaging have enabled the semiconductor industry to keep innovating. Most people reading this page will already be aware of the fact that the computing industry appears to be shifting from CPUs to GPUs. I was surprised to learn that graphics cards from Nvidia have beaten the technical Moore's Law for the last few product iterations.
Nvidia science cards (graphics cards without a video connector) break Moore's Law every year
Epiphany 13: The Second Renaissance (Storm before the Calm?)
The Renaissance was a period in European history, from the 14th to the 17th century, regarded as the cultural bridge between the Middle
Ages and our modern world. It needs to be mentioned that most people at the time did not know they were in a renaissance.
- Before Johannes Gutenberg introduced printing to Europe in 1439, it is safe to say that Europe was effectively run as a Catholic theocracy centered in Rome. But Gutenberg's technology enabled an alternative view of Christianity to flourish, which we now refer to as the Protestant Reformation. Many people thought it was their duty to reject this change, which eventually plunged Europe into the Thirty Years' War (1618-1648). While many Europeans died unnecessarily (since Christ was non-violent), European culture survived. The main takeaway from Gutenberg's technology is this: because many more people were able to get access to books, many learned to read. Some even learned to question.
- Gutenberg's invention is usually associated with publishing materials which enabled the protestant reformation but I think this view misses
the point: it also helped to publish new theories in natural philosophy (science) and mathematics when those fields began to explode ~ 150 years
after Gutenberg's death in 1468.
- Galileo Galilei is best known for employing a telescope (a.k.a. Dutch spy glass) to observe four large moons orbiting Jupiter rather than objects on Earth. The Cliffs Notes version of the story has him going up against the Vatican, which clung to its belief that every heavenly body orbited an Earth located at the center of God's universe.
Galileo was found guilty in 1633 (this was an inquisition) and sentenced to live out the remaining years of his life under house arrest (he was too
well known to be put to death). YOU WOULD THINK that Galileo could have better defended himself by convincing his inquisitors to look into the
telescope eyepiece while it was pointed at the moons of Jupiter. In fact, he tried this but his accusers refused to look into it claiming the device
was the work of the devil.
It needs to be pointed out that without a telescope to extend human senses, it is very difficult to determine if "the
sun orbits the earth" or "earth orbits the sun" (although non-telescopic measurements by Tycho
Brahe come very close but you would need to be a mathematician to interpret the results). This is the beginning of an era where technology
was used to extend the limits of the human senses.
- Within ten years of Galileo's trial, some early scientists had begun rearranging telescope lenses to build microscopes, which then facilitated the discovery of bacteria and cell biology. Once again, technology had been employed to extend human senses
into a new realm.
- There have been countless improvements in telescopes and microscopes since those days but it would appear that humanity had hit a new limit. I say "had" because the invention of smaller yet more powerful computers, along with computer networking via the internet, has ushered in a new age of astronomy and biology where we have put electronic eyes on our machines. For example, computers associated with the Kepler
Spacecraft are able to sense planetary occultation of distant stars. This is something human eyes could never do. Similarly, computers,
robotic sequencers, and the internet enabled the first two Human Genome Projects
as well as the Human Proteome Project
- The internet actually gets its start in 1969 as ARPAnet but is never taken seriously by the
world until 1989 when Tim Berners-Lee invented the World
Wide Web to solve a document sharing problem at CERN. (yep, the American military invented
the internet but European scientists made it useful to the world; It would appear that this renaissance began in Europe as well)
- Many people have already claimed that the internet (er, web) will eventually enable more people to read (on a per capita basis) than did the
efforts of Johannes Gutenberg and I think this is true but we're not there just yet.
- The downside of the internet is that it has facilitated communications between purveyors of various conspiracy theories (like-minded weirdoes),
and has recruited more people into terrorist organizations like ISIS. It has also facilitated a large crop of radio-like internet broadcasters which
have further divided the population into political tribes. I believe this is why we witnessed "couldn't happen" events like BREXIT, or the election of a real estate tycoon as President of the United States. On the flipside, society might not be as divided as we think: while the political left dukes it out with the political right, we observe ~50% of the people not voting. This means members of the political left or right each represent only about a quarter of the population
- This fight between left and right seems very reminiscent of the fight between Catholic and Protestant more than four centuries ago. While some people could not imagine a non-Catholic Europe, it was the children of the people who started the Thirty Years' War who put a stop to the madness by reminding everyone that Christianity was supposed to be a religion of non-violence; not war or inquisitions resulting in new ways to torture people to death. While 2019 marks 30 years since Tim Berners-Lee invented the web, we don't need to engage in a thirty years' war to put an end to our social madness. Just as Gutenberg introduced technology which enabled the advance of science
while (eventually) reducing the importance of religious dogma, I wonder if the world-wide-web will make the dogma of political extremes a thing
of the past.
Epiphany 14: What's in the water in the Nordic/Scandinavian Countries?
Epiphany 15: Big Data renders first-past-the-post voting irrelevant
This next section requires the reader to have some familiarity with a few terms. If you are familiar with these then jump ahead.
- Demographics and demographic profiling
- Big Data
Facebook and Cambridge Analytica - A summary of what happened
re: American presidential election of 2016
re: BREXIT vote of 2016
Comments - Observations
- Facebook sold access to their customer data. Their business partners included Global
Science Research and Cambridge Analytica to only name two of many
- Facebook (and all the other social media outlets) have always done this. It is how you get to use their platform free-of-charge.
- This was not illegal -and- all Facebook users involved "clicked through" acceptance agreements which most people never read
- In 2014, part-time Cambridge University researcher, Aleksandr Kogan (who was also a
co-director of Global Science Research), created a voluntary questionnaire in the form of a Facebook
app called This Is Your Digital Life
- in all versions of this story, this app was used to collect your Facebook "Likes and Dislikes" as well as a list of your Facebook friends
- in many versions of the story, this app also collected the "Likes and Dislikes" of your friends (comment: since those secondary-level people were never presented with an acceptance agreement, this might be illegal if not downright unethical)
- in some versions of this story, there is mention of a 40-item politically-oriented questionnaire
- Aleksandr Kogan's company, Global Science Research, sold his questionnaire data to Cambridge Analytica,
a data mining firm now known to be working for Trump's 2016 presidential campaign.
- We later learned that Cambridge Analytica was also working with Canadian company AggregateIQ
which was working with the Leave
Campaign prior to the British BREXIT referendum
- The results of the voluntary questionnaire were run through a psychographic analysis which
resulted in the Facebook participants being slotted into ~ 27 categories
- I can't remember where I first saw the number "27" but any number of categories could be selected depending upon how the analysis was performed
- It is now believed that Cambridge Analytica collected data on 87 million Facebook users
- Some of the categories identified Facebook users who...
- never vote
- always vote Republican
- always vote Democratic
- are centrists (in the political middle) -and- might be convinced to vote one way or the other
- might vote for Hillary Clinton
- This last group was targeted with advertising which may have masqueraded as fake news. The
intent of this targeted advertising was to tilt the result ever so slightly in the desired direction (pro-Trump or
pro-BREXIT). Something like this can work in spades in any first-past-the-post (winner-take-all) election or plebiscite. In the case of the Hillary Clinton supporters, all that was needed was to convince her centrist supporters to "change their vote" or "stay home"
- Even the silicon valley big-wigs didn't see this one coming.
- Since all first-past-the-post elections can now be manipulated by big data technology combined with psychographics, democracies need to shift to proportional representation. Big data and psychographics will be able to affect these as well, but without a winner-take-all outcome the effect would be greatly diluted
- Mark Twain once said “It’s easier to fool people than to convince them they have been fooled.” There is no way you will be able to convince the
winning side of the American election that there should be a do over; especially now that Trump has already been sworn in. But Britain has not yet
left the EU so could engage in a do over. However, I suspect that the winning side in BREXIT are thinking politically rather than rationally.
- I am very worried about America's continual blaming of Russia and/or Putin. As long as Americans continue to play this
blame-game, they are setting themselves up for a repeat performance from their own social media corporations.
Epiphany 16: Technical revolutions are (sometimes) just evolutions
When you talk to people about the Industrial Revolution most only think about one big change beginning somewhere between 1800 and 1900. But this is a gross oversimplification if you consider that it started with the age of steam, then transitioned to the age of electricity, then transitioned to the information age. When we talk about technology in the information
age should we begin with computers, or should we first start with getting information to people? (scrolls, books, telegraph, radio, television,
cable television, internet).
The Evolution of Locomotives
Thinking about locomotives for a moment, they began by burning wood to produce steam which was used to turn the wheels. Europe quickly experienced
wood shortages so locomotives switched over to coal (or any fossil fuel) with little difficulty. Now it is well known that humans burned petroleum for
over 5,000 years but it wasn't until the mid-1800s that the industrial
revolution commercialized petroleum extraction and refinement. Steam locomotives eventually morphed into diesel
locomotives where the fuel is burned in combustion engines to directly operate pistons (eg. no intermediate steam is required). But the immense
power was difficult to control via a transmission so diesel locomotives morphed into
diesel electric systems where a diesel engine runs an electrical generator which is then used to power electrical motors. At this point you can
see that if external electricity is available then a locomotive might be made more efficient by doing away with the diesel engine and electrical
generator. It would definitely weigh a whole lot less.
The Evolution of Computers
- Before computing machines, the computing label was reserved for people (read: human computers)
- Electromechanical computers were popular in the 1940s (using relays developed for the telephone industry)
- One good example of this is the American relay-based computer known as the Harvard Mark I (note: ENIAC, mentioned below, was vacuum-tube based)
- Tommy Flowers was an engineer for the British Post Office (which ran the British
telegraph and telephone systems) who helped build Colossus. Contrary to popular
belief, Colossus was a hybrid system mostly comprised of electrical relays although vacuum tubes (which the Brits correctly refer to as
Thermionic Valves) were installed in key portions of the CPU.
- Vacuum tube based computers were popular in the 1950s
- many people do not know that the British system was kept secret until the mid-1970s
- Transistor-based computers were popular in the 1960s
- Integrated circuit-based computers were popular in the 1970s
- Integrated circuits have morphed from small-scale integration (SSI), to medium-scale integration (MSI), to large-scale integration (LSI), to very-large-scale integration (VLSI)
- Some examples of MSI + LSI are chips known as ALUs (arithmetic logic units), but chip evolution eventually reached a stage where a whole CPU could be built using one chip, and that chip was known as the Intel 4004
- Personal Computers (short list):
- I will skip over a discussion of industry's shift from CISC to RISC
- the internal wars were reminiscent of the Holy Crusades (lots of religious dogma)
- RISC won out and the companies that didn't adopt it went out of business except Intel
- I still don't know if x86-64 is CISC or RISC; Intel says it is RISC but the base instructions are the same CISC instructions from the 8086;
Intel says they decode CISC instructions into RISC-like micro-instructions (which are run on an internal RISC machine) but this sounds an awful lot like
minicomputers of the 1970s and 1980s that translated CISC instructions into microcode which was run through a sequencer (eg. open one or more
data paths; move the data; close the paths; strobe the ALU; etc.)
- NO OS
- The first computer I ever used was an Interdata Model 70. That particular installation
had no operating system which meant the program ran on bare metal (er, silicon) after being fetched from a cassette tape
drive. In those days, the programmers who wrote stand-alone applications needed to write their own device drivers. Writing your own disk drivers
was possible but not easy. Imagine writing your own routines for data-error-detection and data-error-correction.
- Since the Interdata Model 70 was an architectural clone of the IBM-360 it meant that third-party software libraries could be
purchased which provided ready-built routines for communicating with various devices including: serial interfaces (printers and terminals) as
well as tape storage devices like cassettes and 9-track tape decks. These
software libraries coexisted with standalone programs and were informally known as Tape Operating Systems (TOS)
- On the next iteration, computer manufacturers themselves provided (for a fee) Disk Operating Systems (DOS) with
numerous predefined libraries so it was no longer necessary for programmers to reinvent the wheel for every computer solution under
consideration. After all, who would know better how to communicate with a disk drive than the engineers who built it?
- This name is a bit of a misnomer since a DOS could also support all the devices before it including tapes
- All manufacturers inserted network routines into their DOS systems which would allow programmers to utilize the manufacturer's proprietary
networking protocols. But I never heard the phrase Network Operating System (NOS) until SUN Microsystems published
SunOS/Solaris which had built-in support for TCP/IP (The Internet) as well as support for disks and tapes.
- This name is a bit of a misnomer since a NOS could also support all the devices before it including disks and tapes
- Most people realized that networking improved human-to-computer connectivity but a few visionaries realized that computers could now connect directly to other computers
- Cloud computing is all the rage this side of y2k with numerous definitions of what is meant by that phrase. The most popular one is that, like
a cloud, humans are not really certain where they are currently working (could be any bunch of computers in a data center; or could be in any
one of a number of data centers around the world; no need to worry about a computer blowing a power supply or a drive going bad because there are redundant copies of everything everywhere; hardware is now so cheap that it makes more sense to protect the user's data). One software
paradigm called IaaS (infrastructure
as a service) can be implemented with a free product called OpenStack (this is just
one example of many)
- Improvements in either category (hardware or software) were not linear but exponential. Improvements in hardware and software together are multiplicative (think: exponential growth on steroids)
- Back in the late 1970s, a 16-bit minicomputer like the PDP-11/44
employed an 8085 to run the processor's console switches, LEDs, and serial interface. But some people in the computer industry could never accept the fact that microprocessors like the 8085 (or its descendants) would one day make the minicomputer processor obsolete
- Today I run into all kinds of people who just refuse to accept the fact that CLOUD-based systems are taking over from their favorite computer
hardware and OS
Epiphany 17: What is old is new again (Python vs BASIC)
- I just learned (mid-2018) that Python is now one of the most popular languages for web development, and it runs interactively as well as standalone on almost every computer platform in use today.
- Python is an interpreted fourth-generation language meant to replace BASIC (a third generation
language which began by being mostly interpreted but today is mostly compiled).
- Why is Python not compiled?
- computer systems this side of y2k are so fast that there is little difference between interpreted and compiled "for some applications"
- (send a command to the internet then wait; receive some data then send a command to a relational database then wait; in this instance
Python acts more like the conductor of an orchestra)
- many cloud computing environments (known as heterogeneous clouds) are comprised of different CPU technologies and/or operating systems. For your web application to run anywhere you would either "need precompiled binaries for each CPU flavor" (bad) or "need to run an interpreted language" (better)
- Python is already very fast and I don't know why or how (kind of reminds me of some
Forth programming I did in the 1980s)
- Humorous Observation:
- Before I started my computing career, I learned interpreted BASIC on a Heathkit
H8, Apple II and TRS-80 (each
implementation was different)
- Moving to compiled languages (COBOL, FORTRAN, Pascal, HP-BASIC, VMS-BASIC, C, C++) showed me the true power of compiled code
- I am ending my career learning interpreted Python 3.8 on Linux (CentOS-7.4) as well as Windows-10
- Unlike the "many chefs" BASIC debacle 40 years ago, Python development is in the hands of one organization (The
Python Software Foundation) which means that we will only see one flavor of the Python language although there are already two. Be aware
of the fact that (in 2019) many internal Linux utilities (like YUM and FIRWALL-CMD) depend upon Python-2.7 which means that sloppy additions of
Python-3 to your system will break Linux utilities as I have documented here (yes,
multiple instances of Python can be supported on the same Linux system if you are careful -AND- if your Python scripts begin with a
- Noodling Around on your computer
- IMHO, the personal computer revolution of the 1970s and 1980s was important to our society because it allowed non-specialists to noodle
around with ideas in BASIC. Now anyone who has tried to do serious work in BASIC will soon learn about implementation limitations (e.g. maximum
size of integers or limited precision of floats). Moving to other languages is not much different. For example, it is almost impossible to write
a BASIC demo for the Diffie-Hellman Key Exchange, and difficult to do it in C without resorting to non-standard libraries. But check out these hacks in Python 3.7 (a small Diffie-Hellman sketch also appears after the sample output below):
# author : Neil Rieck
# created: 2019-08-22
# purpose: demo to show that Python is better than BASIC (and most
# other languages) for serious work or noodling around
import math                        # math library
from decimal import Decimal        # arbitrary-precision decimal support
print("pi :",Decimal(math.pi))     # full decimal expansion of the stored float
print("1/3 :","%f" % (1/3))        # limited precision of floats
print("2^32 :",2**32)              # fails with 32-bit signed int in BASIC
print("2^64 :",2**64)              # fails with 64-bit signed int in BASIC
print("2^128 :",2**128) # difficult in many languages
print("2^256 :",2**256) # ''
print("2^512 :",2**512) # ''
print("2^9999 :",2**9999) # ''
print("2^99999:",2**99999) # works in Python
pi : 3.141592653589793115997963468544185161590576171875
1/3 : 0.333333
2^32 : 4294967296
2^64 : 18446744073709551616
2^128 : 340282366920938463463374607431768211456
2^256 : 11579208923731619542357098500868790785326998466...
2^512 : 13407807929942597099574024998205846127479365820...
2^9999 : 99753155844037919244187108134179254191174841594...
Note: "..." is where I chopped the line for this display
- click here to see my Python hacks involving DFT + FFT
Epiphany 18: Industrial Revolutions can be disruptive
The First Industrial Age (Age of Steam)
- History tells us that steam engines existed before 1800 but many visionaries saw what was coming. For example, Adam Smith worried about what might happen when machines replaced people. He advocated government-run schemes to cut weekly work hours (which would put more people to work) as well as reeducation programs. These were some of the reasons for writing his two main books:
- The Theory of Moral Sentiments in 1759 (exactly one century before Charles Darwin's On the Origin of Species in 1859)
- An Inquiry into the Nature and Causes of the Wealth of Nations in 1776
(same year as the American Revolution)
- James Watt introduced his improvement to the steam engine in 1776 which made more efficient
use of coal as a water-pumping device in mines. Note that this modification made steam engines more fuel efficient, thus practical.
- William Murdoch writes a patent application on the steam
locomotive in 1784
- So why do historians pick a general starting year of 1800? I think that this is when a particular technology reaches a critical mass where it
affects most of society.
- Anyway, the point to remember is that once human jobs were lost to steam engines, there was no going back.
- New jobs were created for people to manufacture, install then maintain this technology but the number of new jobs never matches the jobs lost.
The Second Industrial Age (Age of Electricity)
- The first time an electrical telegraph was suggested was 1753, and yet biographies of Michael Faraday and James Clerk Maxwell make me wonder if anyone had a clue before 1900. For example, Georg Ohm published Ohm's Law in 1827 and yet many people who thought they knew better were responsible for damaging underwater communication cables as late as 1858 (the first transatlantic telegraph cable)
- Alexander Graham Bell and Elisha
Gray co-invent the telephone in 1876
- Thomas Edison was generating DC power from the combustion of coal as early as 1882
- Nikola Tesla invents the polyphase
distribution system in 1888 which is then licensed to Westinghouse
- George Westinghouse was generating AC from the fall
of water (hydroelectricity) as early as 1886 in Great Barrington, Massachusetts.
- The spark-gap experiments of Heinrich Hertz from 1886-1888 leads to the invention of
the first practical radio by Guglielmo Marconi in 1895
- Although there had been numerous attempts to generate DC power (which could not be transmitted more than a few miles) at Niagara Falls, the first AC power scheme doesn't go online until 1899
- Just as in the previous example with steam, electricity use reaches a critical mass around 1900
- From this point on, electrical power is used to replace human power wherever possible; industrially then domestically
- Anyway, the point to remember is that once human jobs were lost to electrically-powered machines, there was no going back.
- New jobs were created for people to manufacture, install then maintain this technology but the number of new jobs never matches the jobs lost.
The Third Industrial Age (Age of Data/Information)
- Electromechanical Computers (processor generation-1)
- Vacuum tube Computers (processor generation-2)
- The Colossus computer first appears in Britain in 1943 as a machine for breaking German codes
- ENIAC was the first vacuum-tube computer created by the Americans, in 1946
- Transistorized Computers (processor generation-3)
- The Transistor Computer first prototype was created in
1952 by the University of Manchester
- Burroughs, UNIVAC and IBM all manufacture transistor computers in 1957
- Integrated Circuit Computers (processor generation-4)
- General Electric produces the GE-600 series in 1965 which contains some integrated circuits
- Integrated circuits helped manufacturers create the minicomputer industry; up until then the big players made most of their money creating Mainframe Computers
- Microprocessor Computers (processor generation-5)
- The Internet
- With initial funding from DARPA, the very first appearance of the IP-based internet occurs in 1969
- This morphs into a second-gen internet in 1982 (now features both UDP + TCP)
- With funding from CERN the first appearance of the
world-wide-web occurs in 1989
- Amazon begins online retailing in 1994
- With funding from an NSF grant, Google
develops their very successful search engine in 1998
- Cloud Computing and Big Data
- In 2003, Google publishes a description of GFS (Google File System) in ACM which triggers Cloud Computing as described just above
- From then until now we have seen the information age transform everything from banks, stock markets, real-estate, retail, whatever. Can anyone
remember what life was like before Google Street View on your Smart Phones?
- Surprisingly, changes in retail purchasing habits are causing the closure of numerous shopping malls around the world; especially in North America
- This has really changed manufacturing where JIT (just-in-time) information technologies ensure that parts are available as they are needed rather
than being warehoused. This includes out-sourcing work to so-called third world locations then using the internet, along with B2B software, to keep
tabs on offshore locations.
- New jobs were created for people to manufacture, install then maintain (including programming) this technology but the number of new jobs never matches the jobs lost, so when politicians claim "they will make America Great again" or that "BREXIT can bring back lost jobs" those politicians are either lying or ignorant of the facts just presented. Truth be told:
- 2 million North American jobs were lost between 1990 and 2000 (due to automation -and- outsourcing)
- 6 million North American jobs were lost between 2000 and 2010 (due to automation -and- outsourcing)
- Desiring self-driving cars, trucks and tanks, DARPA has been funding the development of
self-driving vehicles since 2004. Many industry watchers were convinced that this technology would not be ubiquitous until 2030 but it now
appears that this technology could be ready (in one form or another) by 2020.
- This technology (which will be demanded by the insurance industries) will wipe out employment in the last large group of the unskilled people:
driving cars, driving trucks, driving taxis, driving limos
Epiphany 19: The dominance of C/C++ (part 3/3)
In the early days of computing, most hardware manufacturers also created, then sold, software. Putting aside operating systems for a moment, those vendors also sold programming tools including programming language compilers like Fortran
(1958) and COBOL
(1959). One of the
most important changes to the computing industry occurred when computer professionals got together to standardize computer languages.
One measure of any computer language's success is, I think, the number of standardizations. For example, although the BASIC language was once very
popular (I am still forced to use VMS-BASIC every day by my employer in 2019) it has
gone through a very small number of standardizations. This might have something to do
with the fact that many BASIC implementations were so different that standardization was not possible or, perhaps, desirable. On the other hand,
languages like C, C++ and Objective-C have gone through numerous standards and continue to be improved.
For example, non-standard "C" first appeared in 1972 and now referred to as
K&R C after its authors, Brian Kernighan and Denis
Ritchie. Improvements were formalized during then published as C89 in 1989 by ANSI and C90 by ISO. This continued with the
names C99, C11 and C18 as described here.
comment: It appears that "C" moves to a new standardization level approximately every 10 years (on average) whilst C++ moves to a new level approximately every 3 years (on average)
Since 50% of all the digital devices on the planet "running an operating system" use some form of Linux,
it should be no surprise that it is the Linux community that is pushing newer versions of C and C++
Since gcc is used to build Linux, we should look at this toolset a little closer
- The name game
- Originally, gcc meant GNU C Compiler and could compile both C and C++ programs based upon the file extension. Then support was added for Objective-C
- Later, GCC (uppercase) was adopted to mean GNU Compiler Collection
- File extensions
- On most systems including Linux, a file extension of ".c" indicates "this is a 'c' source file"
- On most systems including Linux, a file extension of ".cpp" or ".c++" indicates "this is a c++ source file"
- Only on Linux systems have I seen that a file extension of ".C" (uppercase) also indicates "this is a c++ source file"
- On older Linux systems, the following commands appear to bring up the same executable (at least this was true before gcc-4), but this is not the case since gcc-4.8 (perhaps it is just too difficult to support the C89, C99 and C11 'c' standards as well as all the new C++ standards in one Swiss-army knife):
  - gcc --version
  - c++ --version
  - g++ --version
- On a modern Linux system (like CentOS-7.5) supporting gcc-4.8 we might see that the C++ front ends (c++ and g++) can support newer language standards
- Most people reading this probably compile their "c" programs using a command-line switch specifying C99, but what do you do if you want to use a higher standard like C11? If you are running gcc-4.8 then you are good to go (just use the appropriate command-line switch). But if you are running gcc-4.4 then you might need to go through a couple of other steps: https://edwards.sdsu.edu/research/c11-on-centos-6/
- Over the years, some companies, including Microsoft, have been telling "c" developers to use c++03 rather than compile with c99. (huh?)
- Speaking about GCC for a moment, most people seem to be unaware that it also includes a Fortran compiler: /usr/bin/gfortran
Epiphany 20: A few thoughts on the 50th anniversary of Apollo 11 landing on the moon
The manned spacecraft programs in both Russia and the United States changed the world in more ways than most people would ever know. First off, humanity
cannot ignore the contribution of the Soviet space program because the United States would have never attempted such a thing if it were
not for cold-war politics
. The plain fact is this: many Americans didn't care about Apollo but did care about beating the Russians at
something. History informs that President Richard Nixon and Henry Kissinger were plotting to terminate the Apollo Program at the very time that Nixon
was congratulating, via telephone from the White House, Apollo 11 astronaut Neil Armstrong who was standing on the moon (both Nixon and Kissinger thought that Vietnam was a more urgent issue; more on this in a moment). To add insult to injury, many
Americans lost interest after the apparently routine flight of Apollo 12
(e.g. we won that race
twice; time to move on). Proof of this can be seen in the number of citizens who complained to the TV networks about favorite TV programs being
preempted by spaceflight coverage. Heck, in-flight video transmissions from Apollo 13 were not aired until after a tank explosion placed the mission in jeopardy. Part of this public disinterest led to cancellations
of Apollo flights 18-20
FACT: development of the Apollo Guidance Computer (AGC) was the trigger event
for the largest amount of human technological progress. The design work was done by Draper
Labs at MIT while manufacturing was done by Raytheon. Why was the AGC necessary? Initially, many astronauts and cosmonauts incorrectly thought
that human pilots would be able to directly fly spacecraft much in the same way that pilots flew aircraft. Consider this thought experiment: you are
flying the Lunar Module and need to catch-up-to, then dock with, the Command
Module. For simplicity, assume that both vehicles have identical orbits and velocities but are separated by a distance of 1000 m (3280 f).
Without thinking about life outside of the atmosphere, you fire your RCS
thrusters (force: 100 pounds or 444 newtons) while aiming at the Command Module. This will increase your velocity which pushes you into a higher orbit. Your instantaneous speed is faster but your orbital period is now longer. This causes the Command Module to quickly pass under you making it
impossible to dock. One correct solution dictates that you should fire your RCS thrusters away from the target vehicle, which will cause you to drop
into a slightly lower orbit; then wait a short period of time; then fire your RCS thrusters in the opposite direction which should return you to the
original orbit as the CM but hopefully much closer (BTW, first firing forward then quickly firing backward produces the same result). Remember "F =
ma" from Isaac Newton's second law? Since "a = dV/dT" then the second law can
be rewritten as "F = m x dV/dT" which becomes "F x dT = m x dV". (the left side of the equation is known as impulse).
The instantaneous mass of the LM (which decreases every time you fire your thrusters) determines how long you should fire them in every maneuver
(e.g. two one-second thrusts will not produce identical results; a one-second forward burn cannot (exactly) be cancelled by a one-second reverse
burn). These real-time calculus solutions are best determined by a guidance computer because fuel is limited so must be conserved.
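As a rough illustration of the impulse arithmetic, here is a small Python sketch (my numbers, not NASA's: the 444 N thrust comes from the text, while the Lunar Module mass and the propellant specific impulse are assumed values) showing that two identical one-second burns do not produce identical velocity changes once the consumed propellant is accounted for:
# rough illustration (assumed numbers): two equal burns, two different delta-Vs
F    = 444.0               # RCS thrust in newtons (100 pounds, from the text)
isp  = 290.0               # assumed specific impulse of the propellant (s)
g0   = 9.80665             # standard gravity (m/s^2)
m    = 15000.0             # assumed Lunar Module mass (kg)
dt   = 1.0                 # burn duration (s)
mdot = F / (isp * g0)      # propellant mass-flow rate (kg/s)
for burn in (1, 2):
    dv = F * dt / m        # F x dT = m x dV  ->  dV = F x dT / m
    print(f"burn {burn}: mass = {m:9.2f} kg, delta-V = {dv:.7f} m/s")
    m -= mdot * dt         # the LM is lighter after every burn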
How did the development of the AGC improve things here on Earth? First off, commercial mainframe computers in the 1960s were manufactured from
discrete electronic components, including individual transistors and diodes. So when IBM learned that the AGC computer had to fit into a volume the
size of a bread-box (one cubic foot or 28,316 cc) many IBM engineers didn't think it was possible. The Draper/Raytheon solution employed "integrated
circuits" (semiconductor chips containing numerous transistors) which they were already using in a more primitive way inside Polaris missile guidance
systems. The high per-component prices meant that the American government was their primary customer (Apollo consumed 60% of the IC developed by
America in 1966). Because of high cost, government contractors developed semiconductor test methods to ensure that the government would only pay for
components that met design specifications. These testing methods eventually migrated from the customer (government) back to the manufacturing industry
which resulted in affordable chips for the rest of us. That revolution in chip manufacturing produced things like:
- the development of minicomputers (based on chips) that were more powerful and reliable while less expensive than existing mainframe computers
(based upon discrete components)
- early minicomputers acted as packet routers on a prototype internet known as ARPANET
- work on minicomputers caused Bell Labs to create a portable operating system known as UNIX as well as the "C" programming language. After the
breakup of the telephone system in the 1980s, corporate greed caused American UNIX to be replaced with European Linux.
- Moore's Law (see: Epiphany 11 just above)
- the development of microcomputers for use in minicomputers (technological feedback), then the development of workstations which eventually became
just as powerful as minicomputers
- work stations were used to create the first web servers and web browsers when a British contractor, Tim
Berners-Lee, created the world-wide-web while working for CERN in Switzerland.
comment: many Americans today falsely believe that the world-wide-web was an American invention
- microcomputers became the basis for internet packet routers (CISCO Systems) as well
as commercial and residential firewall appliances
- personal computers which eventually became just as powerful as work stations
- gaming requirements triggered the development of graphics cards and GPUs (graphics processing units which act like thousands of special-purpose processors running in parallel)
- gaming consoles (PlayStation, Xbox, etc.)
- game consoles helped trigger the move from CRT technology to flat screens
- development of monochrome and color LCD displays fed back into the computer industry to replace CRT-based monitors with LCD-based monitors. This business morphs into very large digital displays in commercial movie theaters which trickles down to large-screen TVs in our homes
comment: Apollo used 3-input NAND gates manufactured by Fairchild Semiconductor. Fairchild engineers later left to form Intel and, later still, AMD
Software jobs also changed drastically during this time. While it is true that high-level programming languages like FORTRAN (1957) and COBOL (1959)
existed, the phrase "computer programmer" did not yet exist as computer programming was primarily done by mathematicians.
High-level languages required more memory and CPU power than what was available on the AGC, but they were employed on mainframe computers used "to run
AGC flight simulations" then "generate the necessary binary code" for the hand-made read-only core
rope memory used to hold AGC programs. The level of redundancy built into the AGC programs (see reference-1) should
inform that these people were doing "computer engineering". Click
Margaret Hamilton to see what I mean.
Critics of Apollo mention that the program was too expensive in that it was consuming too much of the national budget with 4% being the oft quoted
number. Let me remind everyone that cold-war concerns at the time mandated that the Defense budget was kept secret. On top of that, no American
citizen knew how much money was being used to support the Vietnam War. Today we know that the total cost of Apollo (in 1968 dollars) was $25 billion
whilst the cost of the Vietnam War (also in 1968 dollars) was $168 Billion dollars. Now everyone knows that it is way harder to create something than
it is to destroy something so allow me to state the obvious: America got no return on the $168 billion investment. Check out the next chart then advise your political representatives accordingly:
Return on investment:
- Apollo Manned Spacecraft Program
  - advances in metallurgy
  - advances in semiconductor technology
  - advances in computer engineering
  - advances in software engineering
  - invention of the internet
  - admiration of the whole world
  - During the peak years, the Apollo program employed approximately 400,000 scientists, engineers and technicians across 20,000 companies. Much of this work was done by, or managed by, defense contractors.
- Vietnam War
  - US soldiers killed
  - US soldiers injured
  - North Vietnamese fighters
  - South Vietnamese soldiers
  - Civilians in Laos (up to 1973)
  - Civilians in Cambodia
  - contempt of the whole world
  - During some peak years, more than 400,000 American soldiers were committed to Vietnam (almost the same number of people tied to the manned spacecraft program). Despite hearing crazy political justifications like "the domino theory", America lost this war and no literal or metaphorical dominos ever fell.
  - Many Americans continue to bang on about the USA being a Christian nation but I wonder if they will ever beat their spears into pruning hooks as is mentioned in Isaiah 2:3-4
- First Gulf War
  - 382 US military killed
  - first use of "depleted uranium" by the Americans
  - American defense contractors do well
- Later wars
  - American defense contractors do well
  - Hopefully everyone reading this knows that "1 trillion" = "1,000 billion"
  - first use of: Extraordinary rendition
- https://fas.org/sgp/crs/natsec/RS22926.pdf (published by: the Congressional Research Service)
- comment: Since world-war-two ended in 1945, the United States of America has lost every war. But no one seems to care as
long as the economy is humming along.
What About China?
Back in the 1960s, conservatism revolved around business interests which caused some Americans to wonder if China was a missed business opportunity.
This was the main reason for Nixon and Kissinger opening relations with China
in 1972. IMHO this trigger event started the Chinese shift from agriculture to industrialism. (think of this as an industrial revolution
confined to one country)
American companies and schools in the 1980s found themselves in the Reagan era of "minimal government" which now goes by the name austerity. This
might have translated into greater economic problems, including unemployment, except for the actions of China's leader, Deng
Xiaoping, who favored "maximal government" so was paying to send thousands of Chinese students to the USA every year to be educated.
I personally experienced this in 1985 Boston: we had morning lectures and
afternoon labs. An English-speaking Chinese student sat one row ahead of me in the lecture hall accompanied by two minders who could not speak English
but were required to pay for student slots (these minders were there to ensure the student would return to China; they passed the day in class by
reading little brown books of political dogma). Back then, Americans correctly welcomed these foreign students (it was a business opportunity) but no
one ever thought that China would eventually compete head-to-head with the USA. I applaud the Chinese students who were able to acquire an education
in a foreign country speaking a foreign language but wonder how many Americans would be willing, or able, to do the same by travelling to China.
- China is investing in their citizens while America is investing in their military which makes me
wonder if America will, one day, recognize "an education gap" much in the way they became mobilized by the Soviet launch of Sputnik 1 in 1957. Many IQ charts (including these two: here
and here) show average IQ levels in Hong Kong and Singapore
(score: 109) are a full eleven points ahead of the USA (score: 98). Also notice that countries with high levels of religious
extremism tend to have lower IQs.
- Since the 1980s, China has moved more than 300 million Chinese citizens from the poor agricultural class into the middle class. This is approximately the same size as the total population of the USA, where "middle class" is becoming a fading memory.
- Since American educators showed China how to modernize, I can't seem to stop thinking about the story of "the Tortoise and the Hare". If Charles
Dickens were alive I am certain he would be toying with writing a book titled "A Tale of Two Countries"
Steps to MAGA (Make America Great Again)
- ELIMINATE OFFENSE SPENDING
- The colored chart (above) provides proof that many millions of people would be alive today if the USA had not been funding foreign wars on foreign shores. One way to make America great again is to stop funneling taxpayer money into those defense programs which, in reality, are offense programs used to support the American empire
- FUND SCIENCE and TECHNOLOGY
- Like an addict that is unable to quit cold-turkey, defense contractors will probably not be able to survive having their government funds reduced. But just as happened during the 1950s and 1960s, defense contractors could be employed by the government to do big science projects through civilian agencies like NASA
- During the 1990s, the American defense budget
was always under $300 billion per year. By 2010 the defense budget had climbed to $721 billion per year. Meanwhile, NASA's budget
has fluctuated between $17 to $20 billion since 2006. If NASA's budget was doubled by diverting money from the defense budget would the Pentagon
even notice? And yet we know that spending on NASA will have a positive ROI (return-on-investment)
- DARPA is one American defense investment with a very high ROI
- FREE EDUCATION
- There are still many people alive today who will tell you that they attended Harvard University in the 1950s and only paid $100.00
- Before defense contractors started receiving the bulk of government monies, the American government used to subsidize all college and university educations. Funding for education slowly dried up as money was diverted into the defense/offense budgets.
- Once money is diverted from offence back into education, the whole economy will repair itself
Epiphany 21: Microsoft is promoting Python3 (Python version 3.x)
2020-05-06: Today's alert from ZDNet informs that Microsoft has added 51 Python videos to their previous 44. I read somewhere that because Bill Gates had
started programming in BASIC, that Microsoft could never really get rid of it (other than the massive changes to "visual studio" between VS6 and VS.net
where they made it play-nice with C/C++ and anything else that generated code under the .net framework). I wonder what Bill would say about the shift to
“Python for beginners”.
- “Python for beginners” playlist from Microsoft (this is the original 44)
- “More Python for beginners” (this is the new 20)
  - video #4 shows how to configure Visual Studio Code which can now be acquired and used FREE of CHARGE
  - Visual Studio Code can integrate with a GIT repository including GitHub
- “Even More Python for Beginners” (this is the new 31)
Waterloo, Ontario, Canada.