Technological Change (in my life) part-2
edit: 2023-05-20 (this is a stream-of-consciousness thing)
Epiphany 10: Smart Phones are really Star Trek devices
- The title says it all; today's Smart Phones are the equivalent of Star Trek Tricorders
merged with Star Trek Communicators, but we don't use them the way Mister Spock did
- Google Maps allows us to navigate without building a mental model of our neighborhoods, but we are now quickly lost
without our phones
- I'm certain Mister Spock was never "playing Pokémon Go" or
"checking his likes on Facebook" so humans have a long way to go
- And who ever thought it made sense to text while driving an automobile? (more accidents today are caused by
texting-while-driving than by drunk-driving)
Epiphany 11: "Moore's Law Revisited" and the 'Big Data' Revolution
The following was partly inspired by the book Thank You for Being Late
(2016) by Thomas Friedman
- Everyone has heard the (apocryphal) story about the king who was so grateful for the game of chess that he offered the
inventor any reward
- The inventor asked for payment in "grains of rice" based upon the following progression over a 64-square board: one
grain for square one, two grains for square two, four grains for square three, eight grains for square four, and so on up
to square 64
- Realizing that the reward would break the bank, the king had the inventor killed (nerds have remained quiet ever since)
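The arithmetic behind the king's surprise is easy to check with a few lines of Python, using the progression just described:

```python
# grains of rice on an 8x8 chessboard: 2^(square-1) grains on each square
grains_on_last_square = 2 ** 63       # square 64 alone
total_grains = 2 ** 64 - 1            # sum of 2^0 + 2^1 + ... + 2^63

print(f"square 64 : {grains_on_last_square:,}")   # 9,223,372,036,854,775,808
print(f"total     : {total_grains:,}")            # 18,446,744,073,709,551,615
```

At a rough 50,000 grains per kilogram, the total works out to hundreds of billions of tonnes of rice, far more than the world has ever grown.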
The Integrated Circuit (chip)
- The 'integrated circuit' was first conceived and implemented by two different people working at competing companies:
- 1958: Jack Kilby at Texas Instruments (TI)
- 1959: Robert Noyce at Fairchild Semiconductor (Noyce later co-founded Intel)
- Wikipedia mentions others but TI and
Fairchild were the first to manufacture a product
- Some semiconductor components act as "resistors, capacitors, and conductors" so the actual transistor count does not
begin until Fairchild first released a commercial 2-transistor chip for sale in 1961
- In late 1964 Gordon Moore was approached by Electronics Magazine
to do an article which would be published in 1965 (the 35th anniversary edition) speculating where the chip industry may be
headed over the next 10 years.
- Fairchild had just finished 1964 making 16-component chips and were planning to double the number in 1965
- Moore had noticed this doubling and so predicted (in the article) that doubling would continue each year for a decade
- This became known as Moore's Law
- the original article is easy to find online
- In 1975 he revised his formula so that doubling would continue every two years
- published charts show transistor counts doubling every two years starting in 1971
- In 2004 executives at Intel realized that continuing die-shrinks of the x86
P6 microarchitecture were requiring an exponential amount of human effort in order to gain only a
linear increase in CPU throughput (part of this problem was due to the amount of heat). So they scrapped all
on-going P6 development, replacing it with the
Intel Core microarchitecture which would facilitate multiple cores rather than a faster single CPU.
- In 2012 Intel announced the 8-core 64-bit Itanium 9500 built from 3.1 billion transistors (made in the USA)
- In 2019 AMD announced their Epyc Rome 64-bit CPU built from 39.5 billion transistors (made in Taiwan)
The Speed of Change
In response to the 1992 Russo-American moratorium on nuclear testing, the US government started ASCI (Accelerated Strategic
Computing Initiative) in 1996 to provide a method for developing nuclear weapons in simulation. ASCI Red was delivered in 1997 and
could execute 1.8 TeraFLOPS. It cost $55 million, was a little smaller than a tennis court, and required the equivalent power of 800
homes. In 2006 Sony released the PS3 which could also execute 1.8 TeraFLOPS at a cost of $300 while only requiring the
equivalent power of three 120W light bulbs.
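Using only the figures quoted above, a quick back-of-the-envelope calculation shows the scale of the change between 1997 and 2006:

```python
# dollars per TeraFLOP, computed from the numbers quoted in the text
asci_red_cost, asci_red_tflops = 55_000_000, 1.8   # ASCI Red, 1997
ps3_cost, ps3_tflops = 300, 1.8                    # Sony PS3, 2006

asci_red = asci_red_cost / asci_red_tflops         # about $30.6 million per TFLOP
ps3 = ps3_cost / ps3_tflops                        # about $167 per TFLOP
print(f"ASCI Red : ${asci_red:,.0f} per TeraFLOP")
print(f"PS3      : ${ps3:,.2f} per TeraFLOP")
print(f"ratio    : {asci_red / ps3:,.0f}x cheaper in nine years")
```

That is a price-performance improvement of roughly 183,000x in under a decade.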
Excerpt page-41 "Thank You for Being Late" (2016) Thomas Friedman
Big Data Revolution of 2006-2008
- In 2003 Google published a paper on the Google File System
(GFS); the ACM paper is easy to find online
- This stimulated others to develop Big Data technology allowing data
center capacity to grow so large that you would never need to delete a file
- According to Friedman, Moore's Law combined with Big Data stimulated a lot of
change in 2007 but these events were obscured by the financial meltdown of 2007-2008
- Activities (2005-2010) include:
  - 2006: Amazon launches Amazon Web Services (AWS)
  - 2006: Twitter launches as a 140-character micro-blogging site
  - 2006: Facebook moves from campus-wide to world-wide membership
  - 2006: Google acquires YouTube
  - 2007: Apple announces, then releases, the iPhone
  - 2007: Google promises a gPhone (Google had acquired Android, a tiny Linux for the gPhone, in 2005); the gPhone is
    rebranded Android and made free to members of the Open Handset Alliance
  - 2007: Google turns up Street View (no more paper street maps)
  - 2007: LinkedIn hits 10 million members
  - 2008: GitHub launches as a repository of open source software
  - 2010: NASA co-creates OpenStack which is put into the public domain (just check your Linux distro)
- The combination of raw computing power along with big data facilitated inventions like: IBM's Watson, Amazon's
Alexa, AWS (Amazon Web Services), not to mention self-driving cars
- Question: So what was so special about 2007?
Answer: Exponential growth (related to Moore's Law) allowed industry to almost fill the first half
of the chessboard with transistors
- using the formula: count = 2^(square-1) we get:
  - the transistor count doubles yearly until 1971
  - Gordon Moore predicts Moore's Law in 1965
  - via his article in Electronics Magazine
  - square 32 is the last square of the first half of the chessboard
  - square 33 is the first square of the second half of the chessboard
  - square 64 is the far future, provided Moore's Law can continue
Epiphany 12: GPUs and Moore's Law
The original definition of Moore's Law stated that the number of transistors on a single chip would "double every year" while
"costing the consumer the same amount".
note: after the first decade, Gordon Moore revised the doubling period to every 2 years
I remember reading articles in the mid-1990s claiming that Moore's Law would hit a limit in 2005 or 2006. The main reasons were
attributed to manufacturing problems related to "the limits of photolithography" and "gate sizes so small that electrons would
be able to quantum tunnel across barriers thus rendering them conductors rather than semiconductors"
Around 2004 Intel announced a change in direction away from "faster single CPUs" toward "single multi-core CPUs" for which they
charged more money. Technically speaking, they avoided Moore's Limit by revising Moore's Law to not include the phrase "costing
the consumer the same amount". So now we have an economic Moore's Law as well as a technical one.
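As a sanity check of the revised (two-year) technical Moore's Law, here is a sketch that projects forward from the 1971 Intel 4004 (about 2,300 transistors); the projection lands remarkably close to the 3.1-billion-transistor Itanium 9500 mentioned earlier:

```python
def moore_projection(year: int, base_year: int = 1971, base_count: int = 2300,
                     doubling_years: float = 2.0) -> int:
    """Project a transistor count assuming doubling every `doubling_years`."""
    return round(base_count * 2 ** ((year - base_year) / doubling_years))

# roughly 3.4 billion for 2012 (the real Itanium 9500 shipped with 3.1 billion)
print(f"{moore_projection(2012):,}")
```

Forty-one years of doubling every two years is only about 20.5 doublings, yet it spans six orders of magnitude.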
Advances including FinFET transistors, gate-all-around
(GAA) technology, and 3D stacking
have enabled the semiconductor industry to keep innovating. Most people reading this page will already be aware of
the fact that the computing industry appears to be shifting from CPUs to GPUs. I was surprised to learn that graphics cards
from Nvidia have beaten the technical Moore's Law for the last few product iterations.
Nvidia science cards (graphics cards without a video connector) break Moore's Law every year
Epiphany 13: The Second Renaissance (Storm before the Calm?)
The Renaissance was a period in European history, from the 14th to the 17th century, regarded as the cultural
bridge between the Middle Ages and our modern world. It needs to be mentioned that most people at the time did not know they
were living in a renaissance.
- Before Johannes Gutenberg introduced printing to Europe in
1439, it is safe to say that Europe was effectively run as a Catholic theocracy centered in Rome. But Gutenberg's technology
enabled an alternative view of Christianity to flourish which we now refer to as the
Protestant Reformation. Many people thought it was their duty to reject this change, which eventually plunged Europe into
the Thirty Years' War (1618-1648). While many Europeans died
unnecessarily, European culture survived. The main takeaway from Gutenberg's technology is this: because many more people were
able to get access to books, many learned to read. Some even learned to question church authority.
- Galileo Galilei is best known for employing a telescope (a.k.a.
Dutch Spy Glass) to observe four large moons orbiting Jupiter, rather than objects on Earth. The Cliffs Notes
version of the story has him going up against the Vatican who clung to their belief that every heavenly body orbited Earth which
was located at the center of god's universe. Galileo was found guilty in 1633 (this was an inquisition) and sentenced to live
out the remaining years of his life under house arrest (he was too well known to be put to death). YOU WOULD THINK that Galileo
could have better defended himself by convincing his inquisitors to look into the telescope eyepiece while it was pointed at the
moons of Jupiter. In fact, he tried this but his accusers refused to look into it claiming the device was the work of the devil.
It needs to be pointed out that without a telescope to extend human senses, it is very difficult to
determine if "the sun orbits the earth" or "the earth orbits the sun" (non-telescopic measurements by Tycho
Brahe came very close, but you needed to be a mathematician to interpret the results). This is the beginning of an era
where technology was used to extend the limits of the human senses.
- Within ten years of Galileo's trial, some early scientists had begun to interchange telescopic lenses to become a microscope
which then facilitated the discovery of bacteria and cell biology. Once again, technology had been employed to
extend human senses into a new realm.
- There have been countless improvements in telescopes and microscopes since those days, but it would appear that humanity had
hit a new limit. I say "had" because the invention of smaller yet more powerful computers, along with computer networking
via the internet, has ushered in a new age of astronomy and biology where we have put electronic eyes on our machines. For
example, computers associated with the Kepler Spacecraft
are able to sense planetary occultation of distant stars. This is something human eyes could never do. Similarly, computers,
robotic sequencers, and the internet enabled the first two Human
Genome Projects as well as the Human Proteome Project
- The internet actually gets its start in 1969 as ARPAnet but is not
taken seriously by the world until 1991 when Tim Berners-Lee
invented the World Wide Web to solve a document-sharing problem at CERN. (yep, the American military invented the internet
but European scientists made it useful to the world; it would appear that this new renaissance began in Europe as well)
- Many people have already claimed that the internet (er, web) will eventually enable more people to read (on a per capita
basis) than did the efforts of Johannes Gutenberg and I think this is true but we're not there just yet.
- The downside of the internet is that it has facilitated communications between crazy people and their conspiracy theories. It
has also facilitated a large crop of radio-like internet broadcasters which further divided the population into political
tribes. I believe this is why we witnessed "couldn't happen" events like BREXIT, or the election of a real estate tycoon as
President of the United States. On the flip side, society might not be as divided as we think: while the political left dukes
it out with the political right, we observe ~50% of the people not voting. This means the political left and right only
represent about 25% each.
- This fight between left and right seems very reminiscent of the fight between Catholic and Protestant more than four centuries
ago. While some people could not imagine a non-Catholic Europe, it was the children of the people who started the Thirty
Years' War who put a stop to the madness by reminding everyone that Christianity was supposed to be a religion of
non-violence; not war or inquisitions resulting in new ways to torture people to death. While 2019 marks 30 years since Tim
Berners-Lee invented the web, we don't need to engage in a thirty year's war to put an end to our social madness. Just as
Gutenberg introduced technology which enabled the advance of science while (eventually) reducing the importance of religious
dogma, I wonder if the world-wide-web will make the dogma of political extremes a thing of the past.
Epiphany 14: Something in the water of Nordic-Scandinavian countries?
Epiphany 15: Big Data renders first-past-the-post voting irrelevant
This next section requires the reader to have some familiarity with a few terms
- Demographics and Demographic Analysis
- Big Data
Facebook and 'Cambridge Analytica' - a summary of what happened
Wikipedia: American presidential
election of 2016
Wikipedia: BREXIT vote of 2016
Comments - Observations
- Facebook sold access to their customer data. One of their
business partners included Global Science Research
- Social media outlets have always done this. It is how you get to use their platform free-of-charge.
- This was not illegal -and- all Facebook users involved "clicked through" acceptance agreements
which most people never read
- In 2014, Global Science Research co-director Aleksandr Kogan created a Facebook app / questionnaire called This Is Your Digital Life
- in all versions of this story, this app was also used to collect your Facebook "Likes and Dislikes"
- in many versions of the story, this app also collected the "Likes and Dislikes" of your FB friends
comment: since those secondary-level people were never presented with an acceptance agreement, this
seems unethical if not illegal
- Aleksandr Kogan's company, Global Science Research, sold his questionnaire data to Cambridge Analytica, a data mining firm:
- who sold information to AggregateIQ which was hired by the Leave
Campaign prior to the British BREXIT referendum
- now known to be working for the 2016 political campaigns of Ted Cruz and Donald Trump
- The results of the voluntary questionnaire were run through a psychographic analysis which resulted in
the Facebook participants being slotted into 27 categories
- It is now believed that Cambridge Analytica collected data on 87 million Facebook users
- Some of the categories identified Facebook people who...
- never vote
- always vote Republican (or conservative)
- always vote Democratic (or liberal)
- are political centrists -and- who might be convinced to vote one way or the other
- This last group was targeted with advertising (which may have masqueraded as fake
news) with the intent of convincing some to vote in the desired direction (pro-Trump or pro-BREXIT) or just stay home.
This works well in any first-past-the-post (winner-take-all) election or plebiscite.
- Since all first-past-the-post elections can now be manipulated by big data technology combined with psychographics,
democracies need to shift to proportional representation.
- Mark Twain once said “It’s easier to fool people than to convince them they have been fooled.”
- I am very worried about America's continual blaming of Russia and/or Putin. As long as Americans do this they will be blind to
the effect of social media in the next election
Epiphany 16: Industrial/Technical Evolution (part-1)
Many associate the Industrial Revolution with the years 1760-1860
but this might be an oversimplification if you consider that it began with the age
of steam, then transitioned to the age of electricity, then transitioned to the age of information. Many associate the
information age with computers but this misses the point that information also flowed over earlier electrical technologies
(e.g. telegraph, telephone, radio, television, cable television, internet) and non-electrical technologies (e.g. scrolls,
books, newspapers)
The Evolution of Locomotives
steam (wood + coal) >> petroleum >> electricity
Thinking about locomotives for a moment, they began by burning wood to produce steam which was used to turn wheels. Europe
quickly experienced wood shortages so locomotives switched over to coal (or any fossil fuel) with little difficulty. It is now
known that humans have burned petroleum for over 5,000 years, but it wasn't until the mid-1800s
that the industrial revolution commercialized petroleum extraction and refinement.
Steam locomotives eventually morphed into diesel locomotives where
the fuel is burned in combustion engines to directly operate pistons (e.g. no intermediate steam is required). But the immense
power was difficult to control via a mechanical transmission so diesel locomotives morphed into
diesel electric systems where "a diesel engine runs an electrical generator" which is then used to power electrical motors.
At this point you can see that if an external electricity source is available then a locomotive might be made much more efficient
(lighter) by doing away with the diesel engine and electrical generator.
The Evolution of Computers
- Before computing machines, the label "Computer" was reserved for people who did calculations
- Electro-mechanical computers were popular in the 1940s (using relays developed for the telephone industry)
- One good example of this is the American relay-based computer known as the Harvard Mark I
- Tommy Flowers was an engineer for the British Post Office (which
ran the British telegraph and telephone systems) who helped build Colossus.
Contrary to popular belief, Colossus was a hybrid system mostly comprised of electrical relays although vacuum tubes (which
the Brits correctly refer to as Thermionic Valves) were installed in key places of the CPU.
- Vacuum tube based computers were popular in the 1950s
- Transistor-based computers were popular in the 1960s
- Integrated circuit-based computers became popular in the 1970s
- Integrated circuits have morphed from small scale integration (SSI), to medium scale integration (MSI), to large scale
integration (LSI), to very large scale integration (VLSI)
- Some examples of MSI + LSI are chips known as ALUs (arithmetic logic units) but chip evolution eventually reached a stage
where a whole CPU could be built using one chip, and that chip was known as the Intel 4004
- Personal Computers (short list):
- mostly skipping over a discussion of industry's shift from CISC to RISC
- the internal CISC/RISC wars were reminiscent of the Holy Crusades (lots of religious dogma)
- RISC won out and the companies that didn't adopt it went out of business except Intel
- I still don't know if x86-64 is CISC or RISC; Intel says it is RISC but the base instructions are the same CISC
instructions from the 8086; Intel says they decompile CISC instructions into RISC instructions (which are run on an internal
RISC machine) but this sounds an awful lot like minicomputers of the 1970s and 1980s that translated CISC instructions into
microcode which was run through a sequencer (e.g. open one or more data paths; move the data; close the paths; strobe the
result into the destination register)
- both x86 and x86-64 supported streaming instructions (single-instruction :: multiple-data) which, IMHO, are definitely CISC
- computers get really fast when streaming instructions are moved onto external graphics cards
- The first computer I ever used was an Interdata Model 70. That particular installation had no operating system which
meant the program ran on bare metal (er, silicon) after being fetched from a cassette tape drive. In those days, programmers
who wrote stand-alone applications directly communicated with their own device (e.g. no device drivers). Imagine the
difficulty in writing your own routines for data-error-detection and data-error-correction.
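For readers who never lived through that era, here is a minimal sketch of the kind of error-detection a stand-alone programmer had to hand-roll (the function name and the simple modulo-256 scheme are illustrative, not Interdata's actual method):

```python
def checksum(block: bytes) -> int:
    """Sum all bytes modulo 256 (catches many single-byte errors, not all)."""
    return sum(block) % 256

record = b"HELLO WORLD"
stored = checksum(record)                    # written to tape after the data block

# on read-back, recompute and compare against the stored byte
assert checksum(record) == stored            # clean read: checksums match
assert checksum(b"HELLO WORLC") != stored    # one damaged byte: error detected
print("checksum byte:", stored)
```

Error *correction* (reconstructing the damaged byte rather than merely detecting it) requires considerably more machinery, which is exactly why ready-built libraries were so valuable.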
- Since the Interdata
Model 70 was an architectural clone of the IBM-360 it meant that third-party software libraries could be purchased
which provided ready-built routines for communicating with various devices including: serial interfaces (printers and
terminals) as well as tape storage devices like cassettes and 9-track
tape decks. These software libraries coexisted with standalone programs and were informally known as Tape
Operating Systems (TOS)
- On the next iteration, computer manufacturers themselves provided (for a fee) Disk Operating Systems
(DOS) with numerous predefined libraries so it was no longer necessary for programmers to reinvent the wheel for every
computer solution under consideration. After all, who would know better how to communicate with a disk drive than the
engineers who built it?
- This name is a bit of a misnomer since a DOS could also support all the devices before it including block-structured tapes
- All manufacturers inserted network routines into their DOS systems which would allow programmers to utilize the
manufacturer's proprietary networking protocols. But I never heard the phrase Network Operating System
(NOS) until Sun Microsystems published SunOS/Solaris which had built-in support for TCP/IP (The Internet) as well as support
for disks and tapes.
- This name is a bit of a misnomer since a NOS could also support all the devices before it including disks and tapes
- Most people realized that networking improved human-to-computer connectivity but a few visionaries realized that computers
could now connect to other computers.
- Cloud computing is all the rage this side of y2k, with numerous definitions of what is meant by that phrase. The most
popular one is that, like a cloud, humans are not really certain where they are currently working: it could be any bunch of
computers in a data center, or any one of a number of data centers around the world. There is no need to worry about a
computer blowing a power supply or a drive going bad because there are redundant copies of everything everywhere (hardware
is now so cheap that it makes more sense to protect the user's data). One software paradigm called IaaS
(infrastructure as a service)
can be implemented with a free product called OpenStack (this is just
one example of many)
- Improvements in either category (hardware or software) were not linear but exponential. Improvements in hardware and software
together are multiplicative (think: exponential on steroids)
- Back in the late 1970s, a 16-bit minicomputer like the PDP-11/44
employed an 8085 to run the processor's console switches, LEDs, and serial interface. But some people in the computer industry
would never get over the fact that microprocessors like the 8085 (or its descendants) would eventually make the minicomputer
processor obsolete
- Today I run into all kinds of people who just refuse to accept the fact that CLOUD-based systems are taking over from their
favorite computer hardware and OS
Epiphany 17: What is old is new again (Python vs BASIC)
- I just learned (mid-2018) that Python is now a popular server-side language; Python can also be used interactively as well
as standalone on almost every computer platform in use today.
- Python is an interpreted fourth-generation language meant to replace BASIC
(a third generation language which began by being mostly interpreted but today is mostly compiled).
- Why is Python not compiled?
- computer systems this side of y2k are so fast that there is little difference between interpreted and compiled "for some
workloads" (e.g. send a command to the internet then wait; receive some data; send a command to a relational database then
wait; in this instance Python acts more like the conductor of an orchestra)
- many cloud computing environments (known as heterogeneous clouds) are comprised of different CPU technologies and/or
operating systems. For your web application to run anywhere you would either "need precompiled binaries for each CPU flavor"
(bad) or "would need to run an interpreted script" (good)
- Python is already very fast and I don't know why or how (kind of reminds me of some
Forth programming I did in the 1980s)
- Humorous Observation:
- Before I started my computing career, I learned interpreted BASIC on: Heathkit
H8 , Apple II and TRS-80
(each implementation was different)
- Moving to compiled languages (COBOL, FORTRAN, Pascal, HP-BASIC, VMS-BASIC, C, C++) showed the true
power of computers
- I am ending my career learning interpreted Python 3.6 on Linux (CentOS-7) as well as Windows-10
- Unlike the "many chefs" debacle 40 years ago with BASIC, Python development is in the hands of one organization (The
Python Software Foundation) which means that we will only see one flavor of the Python language although there are
already two. Be aware of the fact that (in 2019) many internal Linux utilities (like yum and firewall-cmd) depend upon
Python-2.7 which means that sloppy additions of Python-3 to your system will break Linux utilities, as
I have documented elsewhere on this site (yes, multiple instances of Python can be supported on the same Linux system if you
are careful -AND- if your Python scripts begin with a shebang)
- Noodling Around on your computer
- IMHO, the personal computer revolution of the 1970s and 1980s was important to our society because it allowed
non-specialists to noodle around with ideas in BASIC. Anyone who has tried to do serious work in BASIC will soon learn about
implementation limitations (e.g. maximum size of integers or limited precision of floats). Moving to other languages is not
much different. For example, it is almost impossible to write a BASIC demo for the
Diffie-Hellman Key Exchange, and difficult to do it in C without resorting to non-standard libraries. But it is easy in
Python 3.7
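To back up that claim, here is a minimal textbook Diffie-Hellman sketch in Python; the tiny parameters (p=23, g=5) are the classic classroom example and are, of course, nowhere near secure:

```python
import secrets

def dh_demo(p: int, g: int) -> int:
    """One run of textbook Diffie-Hellman in the multiplicative group mod p."""
    a = secrets.randbelow(p - 2) + 1     # Alice's private exponent
    b = secrets.randbelow(p - 2) + 1     # Bob's private exponent
    A = pow(g, a, p)                     # Alice sends A to Bob (public)
    B = pow(g, b, p)                     # Bob sends B to Alice (public)
    s_alice = pow(B, a, p)               # Alice computes the shared secret
    s_bob = pow(A, b, p)                 # Bob computes the same value
    assert s_alice == s_bob              # both sides derive the same key
    return s_alice

print("shared secret:", dh_demo(p=23, g=5))
```

The heavy lifting is Python's three-argument pow(), which performs modular exponentiation on arbitrarily large integers; swap in a 2048-bit prime and the same few lines still work, which is exactly what BASIC (and plain C) cannot do.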
- I have also published Python hacks involving DFT + FFT elsewhere on this site
# author : Neil Rieck
# created: 2019-08-22
# purpose: demo to show that Python is better than BASIC
#          (and most other languages) for serious work and/or noodling
import math                      # math library
from decimal import Decimal      # used to display the exact value of a float
print("pi :", Decimal(math.pi))  # every digit stored in the 64-bit float
print("1/3 : %f" % (1/3))        # default 6-digit float formatting
print("2^32 :", 2**32)           # overflows a 32-bit int (BASIC fails)
print("2^64 :", 2**64)           # overflows a 64-bit int (BASIC fails)
print("2^128 :", 2**128)         # difficult in many languages
print("2^256 :", 2**256)         # ''
print("2^512 :", 2**512)         # ''
print("2^9999 :", 2**9999)       # ''
print("2^99999:", 2**99999)      # works in Python
pi : 3.141592653589793115997963468544185161590576171875
1/3 : 0.333333
2^32 : 4294967296
2^64 : 18446744073709551616
2^128 : 340282366920938463463374607431768211456
2^256 : 11579208923731619542357098500868790785326998466...
2^512 : 13407807929942597099574024998205846127479365820...
2^9999 : 99753155844037919244187108134179254191174841594...
Note: "..." is where I chopped the line for this display
Epiphany 18: Industrial/Technical Evolutions (part-2)
- While there are no official dates, it appears to me that industrial revolutions happen almost every hundred years starting
between 1760 and 1765 (on average)
- It goes without saying that each revolution merges with previous revolutions (heavier-than-air travel first occurred in 1903
which is firmly in the middle of the second industrial age)
- will the third industrial age end earlier as we now enter the age of biology -OR- is biology just another form of information
technology which is much easier to research with computers, robotic sequencers, and the internet?
- Companies like Google and Amazon Web Services are now more valuable than their fossil-fuel equivalents, which suggests that
information is now more valuable than energy
The First Industrial Age (Age of Steam) years: 1760-1860
- History tells us that steam engines existed before 1800 but many visionaries saw what was coming. For example, Adam Smith
worried about what might happen when machines replaced people. He advocated government-run schemes to cut weekly hours-of-work
(which would put more people to work) as well as reeducation programs. These were some of the reasons for writing his two
main works:
- The Theory of Moral Sentiments in 1759
- An Inquiry into the Nature and Causes of the Wealth of
Nations in 1776 (same year as the American Revolution)
- James Watt introduced his improvement to the steam engine in 1776 which
made more efficient use of coal as a water-pumping device in mines. Note that this modification made steam engines more fuel
efficient, thus practical.
- William Murdoch files a patent application for the steam locomotive in 1784
- So why do historians pick a general starting year of 1800? I think that this is when a particular technology reaches a
critical mass where it affects most of society.
- Although new technology jobs were created, the number of new jobs never matched the jobs lost.
- Caveat: in 1859, petroleum was discovered in Petrolia
Ontario and Titusville
Pennsylvania. These discoveries enabled new locomotive designs to flip from steam to the direct burning of petroleum.
Gasoline (once a waste product of petroleum refinement) enables the automotive industry. This second wave of powered machines
occurs in parallel with the age of electricity.
The Second Industrial Age (Age of Electricity) years: 1860-1960
- The first time an electrical telegraph was
suggested was 1753 and yet biographies of Michael Faraday and James
Clerk Maxwell make me wonder if anyone had a clue before 1880. For example, Georg
Ohm published Ohm's Law in 1827 and yet, many people who thought
they knew better were responsible for damaging underwater communication cables as late as 1858.
comment: If those know-it-alls had ever bothered to learn Ohm's Law while developing a working knowledge
of electricity, they would have never cranked up the voltage (which resulted in a breakdown of the gutta-percha insulation).
- Alexander Graham Bell and Elisha
Gray co-invent the telephone in 1876
- Thomas Edison was generating DC power from the combustion of coal as
early as 1882
- Nikola Tesla invents the polyphase
distribution system in 1888 which is then licensed to Westinghouse
- George Westinghouse was
generating AC from the fall of water (hydroelectricity) as early as 1886 in Great Barrington, Massachusetts.
- The spark-gap experiments of Heinrich Hertz from 1886-1888 leads
to the invention of the first practical wireless radio by Guglielmo
Marconi in 1895
comment: field-theory by Maxwell was never taken seriously until Hertz published the results of his experiments
- Although there had been numerous attempts to generate DC power (which could not be efficiently transmitted more than a few
dozen miles) at Niagara Falls, the first AC power scheme doesn't
go online until 1899. This changes everything.
- Just as in the previous example with steam, electricity use reaches a critical mass around 1900
- From this point on, electrical power is used to replace human power wherever possible (industrially then domestically)
- This side of 1900, radio is developed as an analog audio delivery technology based upon modulation schemes: AM, FM, and PM
- This technology is further developed as analog television (black-and-white then color)
- Although new technology jobs were created, the number of new jobs never matched the jobs lost.
The Third Industrial Age (Age of Digital Data / Age of Information) years: 1960-2010?
- Electromechanical Computers (processor generation-1)
- Vacuum tube Computers (processor generation-2)
- The Colossus computer first appears in 1943 Britain as a
German code-breaking tool
- ENIAC was the first vacuum tube computer, created by the Americans in 1945
- Transistorized Computers (processor generation-3)
- The Transistor Computer first
prototype was created in 1952 by the University of Manchester
- Burroughs, UNIVAC and IBM all manufacture transistor computers in 1957
- Integrated Circuit Computers (processor generation-4)
- General Electric produces the GE-600 series in 1965 which
contains some ICs.
- Integrated circuits helped manufacturers create the minicomputer
industry; up until then the big players made most of their money creating Mainframes
- Microprocessor Computers (processor generation-5)
- Bell Labs starts the UNIX OS project in 1969
- The Internet
- With initial funding from DARPA, the very first appearance of a packet-switched
internet (ARPAnet) occurs in 1969
- This morphs into a second-gen internet
in 1982 (now features both UDP + TCP)
- With funding from CERN the first appearance of the
world-wide-web occurs in 1991
- Amazon begins online retailing in 1994
- With funding from a NSF grant, Google
develops their very successful search engine in 1998
- Cloud Computing and Big Data
- In 2003, Google publishes a description of GFS (Google File System) in ACM which triggers Cloud Computing as described above
- From then until now we have seen the information age transform everything: banks, stock markets, real estate, retail, and more. Can anyone remember what life was like before Google Street View on your smart phone?
- Surprisingly, changes in retail purchasing habits are causing the closure of numerous shopping malls around the world, especially in North America and Europe
- This has really changed manufacturing where JIT (just-in-time) information technologies ensure that parts are available as
they are needed rather than being warehoused.
- Also, many IT jobs were then outsourced over the internet to so-called third-world locations
- Although new technology jobs were created, the number of new jobs never matched the jobs lost,
so when politicians claim "they will make America Great again" or that "BREXIT can bring back lost
jobs", those politicians are either lying or ignorant of the facts just presented. Truth be told:
- 2 million North American jobs were lost between 1990 and 2000 (due to automation -and- outsourcing)
- 6 million North American jobs were lost between 2000 and 2005 (due to automation -and- outsourcing)
Speculation: The Fourth Industrial Age (Artificial Intelligence, deep learning, and more?)
- Desiring self-driving cars, trucks, tanks and drones, DARPA has been funding the development of self-driving vehicles
since 2004. Many industry watchers were convinced that this technology would not be ubiquitous until 2030, but it now
appears that this technology could be ready (in one form or another) by 2020, partly due to advancements in "A.I.
and deep learning"
comment: Since 94% of all auto accidents are caused by human error, this technology (which will be
demanded by the insurance industry) will wipe out employment in the last large group of unskilled workers: car drivers,
truck drivers, taxi drivers, limo drivers and bus drivers
- "A.I. and deep learning" have benefited most internet users since 2010, although most people have no idea this
invisible revolution has been taking place; the biggest application is human-language translation (improvements are ongoing)
- Most non-nerds ignored Google's 2012 "A.I. and deep learning" announcement about being able to use computers to locate which
YouTube videos contained pictures of a cat (or not)
- Most A.I. researchers claim they always knew that A.I. could someday easily beat a grand master at Chess but never beat anyone at
the game of Go. So most non-nerds just shrugged when DeepMind
(a division of Google's parent company, Alphabet) announced in 2015 that their "A.I. and deep learning" program named AlphaGo
had just beaten the world champion at Go
- Two years later in 2017, DeepMind announced AlphaZero, a variation of their Chess and Go solutions. Here the word "Zero" means
"zero knowledge from humans" (other than the simple rules of the games). You program two similar systems then allow them to play
each other many millions of times per night (in this way they gain experience at inhuman speeds)
- During Tesla's "A.I. Day" (2021-08-19) it was announced that Tesla was moving the training of their vehicles' A.I. system into a
game world where many millions of miles of driving experience will be gained without any loss of human life. This sounds not unlike
what AlphaZero was doing with Go and Chess.
Epiphany 19: The dominance of C/C++ (part 3/3)
In the early days of computing, most hardware manufacturers also created, then sold, software. Putting aside operating systems
for a moment, those vendors also sold programming tools including programming-language compilers like Fortran
(1958) and COBOL (1959). One of the most important changes to the computing industry occurred when computer professionals got
together to standardize programming languages.
One measure of any computer language's success is, I think, the number of standardizations. For example, although the BASIC
language was once very popular (I am still forced to use VMS-BASIC every day by my employer in 2019), it has gone through a
very small number of standardizations. This might have something to do with the fact that many BASIC implementations were so
different that standardization was not possible or, perhaps, desirable. On the other hand, languages like C, C++ and Objective-C
have gone through numerous standards and continue to be improved.
For example, non-standard "C" first appeared in 1972 and is now referred to as K&R C after its authors, Brian
Kernighan and Dennis Ritchie. Improvements were formalized then published as C89 in 1989 by ANSI and as C90 by ISO.
This continued with the names C99, C11 and C18.
comment: It appears that "C" moves to a new standardization level approximately every 10 years (on average)
whilst C++ moves to a new level approximately every 3 years (on average)
Since 50% of all the digital devices on the planet "running an operating system" use some form of Linux,
it should be no surprise that it is the Linux community that is pushing newer versions of C and C++
Since gcc is used to build Linux, we should look at this toolset a little more closely.
- The name game
- Originally, gcc meant "GNU C Compiler" and could compile both C and C++ programs based upon the file extension. Then support
was added for Objective-C
- Later, GCC (uppercase) was created, which means "GNU Compiler Collection"
- File extensions
- On most systems including Linux, a file extension of ".c" indicates "this is a 'c' source file"
- On most systems including Linux, a file extension of ".cpp" or ".c++" indicates "this is a c++ source file"
- Only on Linux systems have I seen that a file extension of ".C" (uppercase) also indicates "this is a c++ source file"
- On older Linux systems, these commands:
- gcc --version
- c++ --version
- g++ --version
appear to bring up the same executable (at least this is true before gcc-4) but this is not the case since gcc-4.8 (perhaps it
is just too difficult to support the C89, C99 and C11 'c' standards as well as all the new C++ standards in one Swiss-army knife)
- On a modern Linux system (like CentOS-7.5) supporting gcc-4.8, gcc and g++ are separate executables, each of which can support
newer language standards
- Most people reading this probably compile their "c" programs using a command-line switch specifying C99 but what do you do if
you want to use a higher standard like C11? If you are running gcc-4.8 then you are good to go (just use
the appropriate command-line switch). But if you are running gcc-4.4 then you might need to go through a
couple of other steps: https://edwards.sdsu.edu/research/c11-on-centos-6/
- Over the years, some companies, including Microsoft, have been telling "c" developers to use C++03 rather than compile with newer C standards like C99
- Speaking about GCC for a moment, most people seem to be unaware that the collection also includes a Fortran compiler: /usr/bin/gfortran
One thing usually overlooked in this ecosystem begins with the Unix Make utility, first written in 1977 by Stuart Feldman of Bell
Labs. This was the easiest method to get an application running on another Unix implementation, which might be running on a
completely foreign computer architecture. I'm jumping a lot of steps here, but after a time, SUN Microsystems thought that Make was
getting a little long in the tooth so it developed a product called "Solaris Package Manager". It should surprise no one that Red Hat
needed something similar in 1995 to move to Red Hat Linux 2, so it further developed the SUN product, calling it "Red Hat Package
Manager" (file extensions are '.rpm')
Epiphany 20: The Apollo 11 moon landing (50 years later)