Technological Change (in my life) part-2
edit: 2022-07-10 (this is a stream-of-consciousness thing)
Epiphany 10: Smart Phones are really Star Trek devices
- The title says it all; today's Smart Phones are the equivalent of Star Trek Tricorders merged with Star Trek Communicators
- We don't use them in the same way as Mister Spock because they appear to be making us stupid
- Why remember anything when you can look it up on the internet?
- Google Maps allows us to navigate without building a mental model of our neighborhoods, but we are now quickly lost without our phones
- I'm certain Mister Spock was never "playing Pokémon Go" or
"checking his likes on Facebook" so humans have a long way to go
- And whoever thought it made sense to drive an automobile while texting? (more accidents today are caused by texting-while-driving than by drunk-driving)
Epiphany 11: "Moore's Law Revisited" and the 'Big Data' Revolution
note: The following was partly inspired by the book Thank You for Being Late (2016) by Thomas Friedman
The Chessboard
- Everyone has heard the (apocryphal) story about the king who was so grateful for the game of chess that he offered the inventor any reward
- The game developer asked for payment in "grains of rice" based upon the following progression over a 64-square board: one grain for square one, two grains for square two, four grains for square three, eight grains for square four, and so on, doubling up to square 64
- Realizing that the reward would break the bank, the king had the inventor killed (causing nerds to remain quiet ever since)
The Integrated Circuit (chip)
- The 'integrated circuit' was conceived and implemented independently by two people working at competing companies:
- 1958: Jack Kilby at Texas Instruments (TI)
- 1959: Robert Noyce at Fairchild Semiconductor (Noyce later co-founded Intel Corporation)
- Wikipedia mentions others but TI and
Fairchild were the first to manufacture a product
- Some early semiconductor components acted as "resistors, capacitors, and conductors", so the transistor count on chips does not really begin until 1960
- Fairchild released its first commercial 2-transistor chip in 1961
- In late 1964, Gordon Moore was approached by Electronics Magazine to write an article, to be published in 1965 (the 35th anniversary edition), speculating on where the chip industry might be headed over the next 10 years
- Fairchild had just finished 1964 making 16-component chips and were planning to double the number in 1965
- Moore had noticed this doubling and so predicted (in the article) that doubling would continue each year for a decade
- This became known as Moore's Law
- read the original 1965 article (copies are available online)
- In 1975 he revised his formula so that doubling would continue every two years
- comment: charted transistor counts show doubling every two years starting in 1971
- In 2004, executives at Intel realized that continuing die-shrinks and clock-speed increases of their single-core x86 designs (the Pentium 4 "NetBurst" line) were requiring an exponential amount of engineering effort in order to gain only a linear increase in CPU throughput (part of this problem was due to the amount of heat). So they scrapped the planned follow-on designs, replacing them with the Intel Core microarchitecture, which facilitates multiple cores rather than a faster single CPU.
- In 2012, Intel announced the 8-core 64-bit Itanium 9500, built from 3.1 billion transistors (made in the USA)
- In 2019, AMD announced their Epyc "Rome" 64-bit CPU, built from 39.5 billion transistors (made in Taiwan)
The Speed of Change
In response to the 1992 Russo-American moratorium on nuclear testing, the US government started ASCI (Accelerated Strategic
Computing Initiative) in 1996 to provide a method for developing nuclear weapons in simulation. ASCI Red was delivered in 1997 and
could execute 1.8 TeraFLOPS. It cost $55 million, was a little smaller than a tennis court, and required the equivalent power of 800
homes. In 2006, Sony released the PS3, which could also execute 1.8 TeraFLOPS at a cost of $300 while only requiring the equivalent power of three 120 W light bulbs.
Excerpt from page 41 of Thank You for Being Late (2016) by Thomas Friedman
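To put those numbers in perspective, here is a tiny Python sketch using only the figures quoted in the excerpt above (nothing else is assumed):

# cost-per-TeraFLOP comparison using only the figures quoted above
asci_red_cost, asci_red_tflops = 55_000_000, 1.8   # 1997 supercomputer
ps3_cost, ps3_tflops = 300, 1.8                    # 2006 game console
print("ASCI Red: $%.0f per TeraFLOP" % (asci_red_cost / asci_red_tflops))
print("PS3     : $%.0f per TeraFLOP" % (ps3_cost / ps3_tflops))
print("cost ratio: %.0f to 1" % (asci_red_cost / ps3_cost))
# PS3 power, also from the excerpt: three 120 W bulbs = 360 W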
Big Data Revolution of 2006-2008
- In 2003, Google published an ACM paper describing the Google File System (GFS)
- This stimulated others to develop Big Data technology, allowing data-center capacity to grow so large that you would never need to delete a file
- According to Friedman, Moore's Law combined with Big Data stimulated a lot of
change in 2007 but these events were obscured by the financial meltdown of 2007-2008
- Activities (2005-2010) include:
Year | Activity                                   | Notes
2005 | YouTube                                    |
2006 | Amazon Web Services                        |
     | Twitter                                    | a 140-character micro-blogging site
     | Facebook                                   | moves from campus-wide to world-wide
2007 | Apple announces, then releases, the iPhone | Google promises a gPhone
     | Google acquires Android                    | offers a tiny Linux for the gPhone; the gPhone is rebranded Android, free to members of the Open Handset Alliance
     | Google acquires YouTube                    |
     | Google turns up Street View                | no more street maps
     | LinkedIn hits 10 million members           |
2008 | GitHub                                     | repository of open-source software
2010 | NASA co-creates OpenStack                  | put into the public domain (just check your Linux distro)
- The combination of raw computing power along with big data facilitated inventions like: IBM's Watson, Amazon's
Alexa, AWS (Amazon Web Services), not to mention self-driving cars
- Question: So what was so special about 2007?
Answer: Exponential growth (related to Moore's Law) allowed industry to almost fill the first half
of the chessboard with transistors
- using the formula count = 2^(square-1) we get the following table (a small Python sketch of this arithmetic appears after the table):
chess square | Exp | transistor count | year | comment
1            | 0   | 1                | 1960 | doubles yearly until 1971
2            | 1   | 2                | 1961 |
3            | 2   | 4                | 1962 |
4            | 3   | 8                | 1963 |
5            | 4   | 16               | 1964 | Gordon Moore predicts Moore's Law
6            | 5   | 32               | 1965 | article in Electronics Magazine
...          | ... | ...              | ...  |
30           | 29  | 536 million      | 2007 |
31           | 30  | 1.07 billion     | 2009 |
32           | 31  | 2.1 billion      | 2011 | last square of the first half of the chessboard
33           | 32  | 4.3 billion      | 2013 | first square of the second half of the chessboard
...          | ... | ...              | ...  |
64           | 63  |                  | ???  | the far future, provided Moore's Law can continue
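Here is a small Python sketch of the arithmetic behind the table above. The year mapping is my own approximation of the table's assumptions (one doubling per year from 1960 to 1971, then one every two years); it reproduces the rows shown but is not an official formulation of Moore's Law:

# count = 2^(square-1); the year formula is an assumption that matches the rows above
def transistor_count(square):
    return 2 ** (square - 1)

def year_for(square):
    # yearly doublings 1960-1971 (square 12), then one doubling every two years
    return 1960 + (square - 1) if square <= 12 else 1971 + 2 * (square - 12)

for square in (1, 5, 6, 30, 31, 32, 33):
    print(f"square {square:2d}: {transistor_count(square):,} transistors (~{year_for(square)})")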
Epiphany 12: GPUs and Moore's Law
The original definition of Moore's Law stated that the number of transistors on a single chip would "double every year" while
"costing the consumer the same amount".
(note: after the first decade, Gordon Moore revised the doubling period to every two years)
I remember reading articles in the mid-1990s claiming that Moore's Law would hit a limit in 2005 or 2006. The main reasons were attributed to manufacturing problems related to "the limits of photolithography" and "gate sizes so small that electrons would be able to quantum tunnel across barriers, thus rendering them conductors rather than semiconductors".
Around 2004 Intel announced a change in direction away from "faster single CPUs" toward "single multi-core CPUs" for which they
charged more money. Technically speaking, they avoided Moore's Limit by revising Moore's Law to not include the phrase "costing
the consumer the same amount". So now we have an economic Moore's Law as well as a technical one.
Advances including FinFET technology, Tri-Gate technology, Gate-all-around (GAA) technology, and 3D IC stacking have enabled the semiconductor industry to keep innovating. Most people reading this page will already be aware that the computing industry appears to be shifting from CPUs to GPUs. I was surprised to learn that graphics cards from Nvidia have beaten the technical Moore's Law for the last few product iterations.
Nvidia science cards (graphics cards without a video connector) break Moore's Law every year
Epiphany 13: The Second Renaissance (Storm before the Calm?)
The Renaissance was a period in European history, from the 14th to the 17th century, regarded as the cultural
bridge between the Middle Ages and our modern world. It needs to be mentioned that most people at the time did not know they were in
a renaissance.
- Before Johannes Gutenberg introduced printing to Europe in 1439, it is safe to say that Europe was effectively run as a Catholic theocracy centered in Rome. But Gutenberg's technology enabled an alternative view of Christianity to flourish, which we now refer to as the Protestant Reformation. Many people thought it was their duty to reject this change, which eventually plunged Europe into the Thirty Years' War (1618-1648). While many Europeans died unnecessarily, European culture survived. The main takeaway from Gutenberg's technology is this: because many more people were able to get access to books, many learned to read. Some even learned to question church authority.
- Galileo Galilei is best known for employing a telescope (a.k.a. Dutch spy glass) to observe four large moons orbiting Jupiter, rather than using it to view objects on Earth. The Cliffs Notes version of the story has him going up against the Vatican, who clung to their belief that every heavenly body orbited Earth, which was located at the center of God's universe. Galileo was found guilty in 1633 (this was an inquisition) and sentenced to live out the remaining years of his life under house arrest (he was too well known to be put to death). YOU WOULD THINK that Galileo could have better defended himself by convincing his inquisitors to look into the telescope eyepiece while it was pointed at the moons of Jupiter. In fact, he tried this, but his accusers refused to look into it, claiming the device was the work of the devil. It needs to be pointed out that without a telescope to extend human senses, it is very difficult to determine whether "the sun orbits the earth" or "the earth orbits the sun" (non-telescopic measurements by Tycho Brahe came very close, but you needed to be a mathematician to interpret the results). This was the beginning of an era where technology was used to extend the limits of the human senses.
- Within ten years of Galileo's trial, some early scientists had begun rearranging telescope lenses into microscopes, which then facilitated the discovery of bacteria and cell biology. Once again, technology had been employed to extend human senses into a new realm.
- There have been countless improvements in telescopes and microscopes since those days, but it would appear that humanity had hit a new limit. I say "had" because the invention of smaller yet more powerful computers, along with computer networking via the internet, has ushered in a new age of astronomy and biology where we have put electronic eyes on our machines. For example, computers associated with the Kepler spacecraft are able to sense planetary transits of distant stars. This is something human eyes could never do. Similarly, computers, robotic sequencers, and the internet enabled the first two Human Genome Projects as well as the Human Proteome Project.
Observations/Speculations
- The internet actually got its start in 1969 as ARPAnet but was never taken seriously by the world until 1991, when Tim Berners-Lee invented the World Wide Web to solve a document-sharing problem at CERN (yep, the American military invented the internet, but European scientists made it useful to the world; it would appear that this new renaissance began in Europe as well)
- Many people have already claimed that the internet (er, web) will eventually enable more people to read (on a per capita
basis) than did the efforts of Johannes Gutenberg and I think this is true but we're not there just yet.
- The downside of the internet is that it has facilitated communications between crazy people and their conspiracy theories. It has also facilitated a large crop of radio-like internet broadcasters which further divided the population into political tribes. I believe this is why we witnessed "couldn't happen" events like BREXIT, or the election of a real estate tycoon as President of the United States. On the flip side, society might not be as divided as we think: while the political left dukes it out with the political right, we observe ~50% of the people not voting. This means the political left and the political right each represent only about 25% of the population.
- This fight between left and right seems very reminiscent of the fight between Catholic and Protestant more than four centuries ago. While some people could not imagine a non-Catholic Europe, it was the children of the people who started the Thirty Years' War who put a stop to the madness by reminding everyone that Christianity was supposed to be a religion of non-violence, not one of war or inquisitions resulting in new ways to torture people to death. While 2019 marks 30 years since Tim Berners-Lee invented the web, we don't need to engage in a thirty years' war to put an end to our social madness. Just as Gutenberg introduced technology which enabled the advance of science while (eventually) reducing the importance of religious dogma, I wonder if the world-wide-web will make the dogma of political extremes a thing of the past.
Epiphany 14: Something in the water of Nordic-Scandinavian countries?
Epiphany 15: Big Data renders first-past-the-post elections as irrelevant
This next section requires the reader to have some familiarity with a few terms
- Demographics and Demographic Analysis
- Psychographics
- Big Data
Facebook and 'Cambridge Analytica' - a summary of what happened
Wikipedia: American presidential election of 2016
Wikipedia: BREXIT vote of 2016
- Facebook sold access to their customer data. One of their business partners was Global Science Research
- Social media outlets have always done this. It is how you get to use their platform free-of-charge.
- This was not illegal -and- all Facebook users involved "clicked through" acceptance agreements
which most people never read
- In 2014, Global Science Research co-director Aleksandr Kogan created a Facebook app / questionnaire called This Is Your Digital Life
- in all versions of this story, this app was also used to collect your Facebook "Likes and Dislikes"
- in many versions of the story, this app also collected the "Likes and Dislikes" of your FB friends
comment: since those second-level people were never presented with an acceptance agreement, this seems unethical if not illegal
- Aleksandr Kogan's company, Global Science Research, sold his questionnaire data to Cambridge Analytica, a data mining firm:
- who sold information to AggregateIQ which was hired by the Leave
Campaign prior to the British BREXIT referendum
- now known to be working for the 2016 political campaigns of Ted Cruz and Donald Trump
- The results of the voluntary questionnaire were run through a psychographic analysis which resulted in
the Facebook participants being slotted into 27 categories
- It is now believed that Cambridge Analytica collected data on 87 million Facebook users
- Some of the categories identified Facebook people who...
- never vote
- always vote Republican (or conservative)
- always vote Democratic (or liberal)
- are political centrists -and- who might be convinced to vote one way or the other
- This last group was targeted with advertising (which may have masqueraded as fake news) with the intent of convincing some to vote in the desired direction (pro-Trump or pro-BREXIT) or just stay home. This works well in any first-past-the-post (winner-take-all) election or plebiscite.
Comments - Observations
- Since all first-past-the-post elections can now be manipulated by big data technology combined with psychographics,
democracies need to shift to proportional representation.
- Mark Twain is often quoted as saying, "It's easier to fool people than to convince them they have been fooled."
- I am very worried about America's continual blaming of Russia and/or Putin. As long as Americans do this they will be blind to
the effect of social media in the next election
Supporting Material
Epiphany 16: Industrial/Technical Evolution (part-1)
When you talk to anyone about the Industrial Revolution, most people only think about one big change between 1760 and 1860. But this is a gross oversimplification if you consider that it started with the age of steam, then transitioned to the age of electricity, then transitioned to the age of information. When we talk about technology in the information age, should we begin with computers, or should we first start with getting information to people (e.g. telegraph, telephone, radio, television, cable television, internet)? You might even wish to begin with scrolls, newspapers, and books.
The Evolution of Locomotives
steam (wood + coal) >> petroleum >> electricity
Thinking about locomotives for a moment, they began by burning wood to produce steam which was used to turn wheels. Europe quickly experienced wood shortages, so locomotives switched over to coal (or any fossil fuel) with little difficulty. It is now known that humans have burned petroleum for over 5,000 years, but it wasn't until the mid-1800s that the industrial revolution commercialized petroleum extraction and refinement. Steam locomotives eventually morphed into diesel locomotives where the fuel is burned in combustion engines to directly operate pistons (i.e. no intermediate steam is required). But the immense power was difficult to control via a mechanical transmission, so diesel locomotives morphed into diesel-electric systems where "a diesel engine runs an electrical generator" which is then used to power electric motors. At this point you can see that if an external electricity source is available, then a locomotive might be made much more efficient (lighter) by doing away with the diesel engine and electrical generator.
The Evolution of Computers
Hardware
- Before computing machines, the label Computer was reserved for people who did calculations
- Electro-mechanical computers were popular in the 1940s (using relays developed for the telephone industry)
- One good example of this is the American computer known as the Harvard Mark I
- Tommy Flowers was an engineer for the British Post Office (which
ran the British telegraph and telephone systems) who helped build Colossus.
Contrary to popular belief, Colossus was a hybrid system made up mostly of electrical relays, although vacuum tubes (which the Brits correctly refer to as thermionic valves) were installed in key places of the CPU.
- Vacuum tube based computers were popular in the 1950s
- Transistor-based computers were popular in the 1960s
- Integrated circuit-based computers became popular in the 1970s
- Integrated circuits have morphed from small-scale integration (SSI), to medium-scale integration (MSI), to large-scale integration (LSI), to very-large-scale integration (VLSI)
- Some examples of MSI and LSI are chips known as ALUs (arithmetic logic units), but chip evolution eventually reached a stage where a whole CPU could be built using one chip, and that chip was known as the Intel 4004
- Personal Computers (short list):
- mostly skipping over a discussion of industry's shift from CISC to RISC
- the internal CISC/RISC wars were reminiscent of the Holy Crusades (lots of religious dogma)
- RISC won out and the companies that didn't adopt it went out of business except Intel
- I still don't know if x86-64 is CISC or RISC; Intel says it is RISC, but the base instructions are the same CISC instructions from the 8086; Intel says they decode CISC instructions into RISC-like instructions (which are run on an internal RISC machine), but this sounds an awful lot like minicomputers of the 1970s and 1980s that translated CISC instructions into microcode which was run through a sequencer (e.g. open one or more data paths; move the data; close the paths; strobe the ALU; etc.)
- both x86 and x86-64 support streaming instructions (single-instruction, multiple-data) which, IMHO, are definitely CISC
- computers get really fast when streaming instructions are moved onto external graphics cards
Software Systems
- NoOS
- The first computer I ever used was an Interdata Model 70. That particular installation had no operating system which
meant the program ran on bare metal (er, silicon) after being fetched from a cassette tape drive. In those days, programmers
who wrote stand-alone applications directly communicated with their own device (e.g. no device drivers). Imagine the
difficulty in writing your own routines for data-error-detection and data-error-correction.
- TOS
- Since the Interdata
Model 70 was an architectural clone of the IBM-360 it meant that third-party software libraries could be purchased
which provided ready-built routines for communicating with various devices including: serial interfaces (printers and
terminals) as well as tape storage devices like cassettes and 9-track
tape decks. These software libraries coexisted with standalone programs and were informally known as Tape
Operating Systems (TOS)
- DOS
- On the next iteration, computer manufacturers themselves provided (for a fee) Disk Operating Systems
(DOS) with numerous predefined libraries so it was no longer necessary for programmers to reinvent the wheel for every
computer solution under consideration. After all, who would know better how to communicate with a disk drive than the
engineers who built it?
- This name is a bit of a misnomer since a DOS could also support all the devices before it including block-structured tapes
- NOS
- All manufacturers inserted network routines into their DOS systems which would allow programmers to utilize the
manufacturer's proprietary networking protocols. But I never heard the phrase Network Operating System
(NOS) until SUN Microsystems published SunOS/Solaris which had built-in support for TCP/IP (The Internet) as well as support
for disks and tapes.
- This name is a bit of a misnomer since a NOS could also support all the devices before it including disks and tapes
- Most people realized that networking improved human-to-computer connectivity but a few visionaries realized that computers
could now connect to other computers.
- CLOUD
- Cloud computing is all the rage this side of y2k with numerous definitions of what is meant by that phrase. The most
popular one is that, like a cloud, humans are not really certain where they are currently working (could be any bunch of
computers in a data center; or could be in any one of a number of data centers around the world; no need to worry about a
computer blowing a power supply or a drive going bad because there are redundant copies of everything everywhere; hardware is now so cheap that it makes more sense to protect the user's data). One software paradigm called IaaS
(infrastructure as a service)
can be implemented with a free product called OpenStack (this is just
one example of many)
Observations:
- Improvements in either category (hardware or software) were not linear but exponential. Improvements in hardware and software together are multiplicative (think: exponential growth on steroids)
- Back in the late 1970s, a 16-bit minicomputer like the PDP-11/44
employed an 8085 to run the processor's console switches, LEDs, and serial interface. But some people in the computer industry
could never accept that microprocessors like the 8085 (or its descendants) would eventually make the minicomputer processor obsolete
- Today I run into all kinds of people who just refuse to accept the fact that CLOUD-based systems are taking over from their
favorite computer hardware and OS
Epiphany 17: What is old is new again (Python vs BASIC)
- I just learned (mid-2018) that Python is now a
more popular web programming language than JavaScript. While JavaScript is usually used client-side and Python is usually used
server-side, Python can also be used interactively as well as standalone on almost every computer platform in use today.
- Python is an interpreted fourth-generation language meant to replace BASIC
(a third generation language which began by being mostly interpreted but today is mostly compiled).
- Why is Python not compiled?
- computer systems this side of y2k are so fast that there is little difference between interpreted and compiled "for some
applications"
- (send a command to the internet then wait; receive some data then send a command to a relational database then wait;
in this instance Python acts more like the conductor of an orchestra)
- many cloud computing environments (known as heterogeneous clouds) are comprised of different CPU technologies and/or
operating systems. For your web application to run anywhere you would either "need precompiled binaries for each CPU flavor"
(bad) or "would need to run an interpreted script" (good)
- Python is already very fast and I don't know why or how (kind of reminds me of some
Forth programming I did in the 1980s)
- Humorous Observation:
- Before I started my computing career, I learned interpreted BASIC on: Heathkit
H8 , Apple II and TRS-80
(each implementation was different)
- Moving to compiled languages (COBOL, FORTRAN, Pascal, HP-BASIC, VMS-BASIC, C, C++) showed the true power of computers
- I am ending my career learning interpreted Python 3.6 on Linux (CentOS-7) as well as Windows-10
- Unlike the "many chefs" debacle 40 years ago with BASIC, Python development is in the hands of one organization (The
Python Software Foundation) which means that we will only see one flavor of the Python language although there are
already two. Be aware of the fact that (in 2019) many internal Linux utilities (like YUM and FIREWALL-CMD) depend upon
Python-2.7 which means that sloppy additions of Python-3 to your system will break Linux utilities as
I have documented here (yes, multiple instances of Python can be supported on the same Linux system if you are careful
-AND- if your Python scripts begin with a shebang)
- Noodling Around on your computer
- IMHO, the personal computer revolution of the 1970s and 1980s was important to our society because it allowed
non-specialists to noodle around with ideas in BASIC. Anyone who has tried to do serious work in BASIC will soon learn about
implementation limitations (e.g. maximum size of integers or limited precision of floats). Moving to other languages is not
much different. For example, it is almost impossible to write a BASIC demo of the Diffie-Hellman Key Exchange, and difficult to do it in C without resorting to non-standard libraries. But check out these hacks in Python 3.7 (a minimal Diffie-Hellman sketch also appears after the code output below)
- click here to see my Python hacks involving DFT + FFT
#!/bin/python3
# author : Neil Rieck
# created: 2019-08-22
# purpose: demo to show that Python is better than BASIC
#          (and most other languages) for serious work and/or noodling
# -----------------------------------------------------------
import math                # math library
#
print("pi :",format(math.pi,'.48f'))
print("1/3 :",format(1/3,'48f'))
print("2^32 :",2**32)      # too big for a 32-bit signed int (BASIC fails)
print("2^64 :",2**64)      # too big for a 64-bit signed int (BASIC fails)
print("2^128 :",2**128)    # difficult in many languages
print("2^256 :",2**256)    # ''
print("2^512 :",2**512)    # ''
print("2^9999 :",2**9999)  # ''
print("2^99999:",2**99999) # works in Python
OUTPUT
pi : 3.141592653589793115997963468544185161590576171875
1/3 : 0.333333
2^32 : 4294967296
2^64 : 18446744073709551616
2^128 : 340282366920938463463374607431768211456
2^256 : 11579208923731619542357098500868790785326998466...
2^512 : 13407807929942597099574024998205846127479365820...
2^9999 : 99753155844037919244187108134179254191174841594...
Note: "..." is where I chopped the line for this display
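Since the hacks mentioned above are only linked from the original page, here is a minimal sketch of a Diffie-Hellman exchange in Python. The prime and generator below are toy values chosen only for illustration; real systems use vetted primes thousands of bits long, which Python's arbitrary-precision integers handle just as easily:

# minimal Diffie-Hellman sketch (toy parameters for illustration only)
import secrets

p = 467   # public prime modulus (a toy value; real systems use huge vetted primes)
g = 2     # public generator (also a toy value)

a = secrets.randbelow(p - 2) + 1   # Alice's private key
b = secrets.randbelow(p - 2) + 1   # Bob's private key

A = pow(g, a, p)                   # Alice sends g^a mod p
B = pow(g, b, p)                   # Bob sends g^b mod p

print("Alice's shared secret:", pow(B, a, p))   # (g^b)^a mod p
print("Bob's   shared secret:", pow(A, b, p))   # (g^a)^b mod p -- same number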
Epiphany 18: Industrial/Technical Evolutions (part-2)
Food-for-thought:
- While there are no official dates, it appears to me that industrial revolutions happen almost every hundred years starting
between 1760 and 1765 (on average)
- It goes without saying that each revolution merges with previous revolutions (heavier-than-air travel first occurred in 1903, which is firmly in the middle of the second industrial age)
- will the third industrial age end earlier as we now enter the age of biology -OR- is biology just another form of information technology which is much easier to research with computers, robotic sequencers, and the internet?
- Companies like Google and Amazon Web Services are now more valuable than their fossil fuel equivalents which is proof that
Information is more valuable than energy
The First Industrial Age (Age of Steam) years: 1760-1860
- History tells us that steam engines existed before 1800 but many visionaries saw what was coming. For example, Adam Smith
worried about what might happen when machines replaced people. He advocated government-run schemes to cut weekly hours-of-work
(which would put more people to work) as well as reeducation programs. These were some of the reasons for writing his two main
books:
- The Theory of Moral Sentiments in 1759
- An Inquiry into the Nature and Causes of the Wealth of
Nations in 1776 (same year as the American Revolution)
- James Watt introduced his improved steam engine in 1776, which made more efficient use of coal when pumping water out of mines. Note that this modification made steam engines fuel efficient enough to be practical.
- William Murdoch wrote a patent application for a steam locomotive in 1784
- So why do historians pick a general starting year of 1800? I think that this is when a particular technology reaches a
critical mass where it affects most of society.
- Although new technology jobs were created, the number of new jobs never matched the jobs lost.
- Caveat: in 1859, petroleum was discovered in Petrolia
Ontario and Titusville
Pennsylvania. These discoveries enabled new locomotive designs to flip from steam to the direct burning of petroleum.
Gasoline (once a waste product of petroleum refinement) enables the automotive industry. This second wave of powered machines
occurs in parallel with the age of electricity.
The Second Industrial Age (Age of Electricity) years: 1860-1960
- The first time an electrical telegraph was suggested was 1753, and yet biographies of Michael Faraday and James Clerk Maxwell make me wonder if anyone had a clue before 1880. For example, Georg Ohm published Ohm's Law in 1827 and yet many people who thought they knew better were responsible for damaging underwater communication cables as late as 1858, as documented here.
comment: If those know-it-alls had ever bothered to learn Ohm's Law while developing a working knowledge of electricity, they would never have cranked up the voltage (which resulted in a breakdown of the gutta-percha insulation).
- Alexander Graham Bell and Elisha Gray independently invent the telephone in 1876
- Thomas Edison was generating DC power from the combustion of coal as
early as 1882
- Nikola Tesla invents the polyphase
distribution system in 1888 which is then licensed to Westinghouse
- George Westinghouse was
generating AC from the fall of water (hydroelectricity) as early as 1886 in Great Barrington, Massachusetts.
- The spark-gap experiments of Heinrich Hertz from 1886-1888 led to the invention of the first practical wireless radio by Guglielmo Marconi in 1895
comment: Maxwell's field theory was not taken seriously until Hertz published the results of his experiments
- Although there had been numerous attempts to generate DC power (which could not be efficiently transmitted more than a few
dozen miles) at Niagara Falls, the first AC power scheme doesn't
go online until 1899. This changes everything.
- Just as in the previous example with steam, electricity use reaches a critical mass around 1900
- From this point on, electrical power is used to replace human power wherever possible (industrially then domestically)
- This side of 1900, radio is developed as an analog audio delivery technology based upon modulation schemes: AM, FM, and PM
- This technology is further developed as analog television (black-and-white then color)
- Although new technology jobs were created, the number of new jobs never matched the jobs lost.
The Third Industrial Age (Age of Digital Data / Age of Information) years: 1960-2010?
- Electromechanical Computers (processor generation-1)
- Vacuum tube Computers (processor generation-2)
- The Colossus computer first appears in Britain in 1943 as a tool for breaking German codes
- ENIAC was the first vacuum tube computer created by the Americans in
1946
- Transistorized Computers (processor
generation-3)
- The first Transistor Computer prototype was created in 1953 by the University of Manchester
- Burroughs, UNIVAC and IBM all manufacture transistor computers in 1957
- Integrated Circuit Computers (processor generation-4)
- General Electric produces the GE-600 series in 1965 which
contains some ICs.
- Integrated circuits help manufacturers create the minicomputer industry; up until then the big players made most of their money building Mainframe Computers
- Microprocessor Computers (processor generation-5)
- Bell Labs starts the UNIX OS project in 1969
- The Internet
- With initial funding from DARPA, the very first appearance of the ARPAnet (the packet-switched precursor of the internet) occurs in 1969
- This morphs into a second-gen internet in 1982 (now featuring both TCP and UDP over IP)
- With funding from CERN the first appearance of the
world-wide-web occurs in 1991
- Amazon begins online retailing in 1994
- With funding from an NSF grant, Google develops their very successful search engine in 1998
- Cloud Computing and Big Data
- In 2003, Google publishes a description of GFS (Google File System) via the ACM, which triggers Cloud Computing as described above
- From then until now we have seen the information age transform everything from banks, stock markets, and real estate to retail. Can anyone remember what life was like before Google Street View on your Smart Phone?
- Surprisingly, changes in retail purchasing habits are causing the closure of numerous shopping malls around the world, especially in North America and Europe
- This has really changed manufacturing, where JIT (just-in-time) information technologies ensure that parts are available as they are needed rather than being warehoused.
- Also, many IT jobs were then outsourced to so-called third-world locations using the internet.
- Although new technology jobs were created, the number of new jobs never matched the jobs lost.
comment: so when politicians claim "they will make America Great again" or that "BREXIT can bring back lost
jobs" those politicians are either lying or ignorant of the facts just presented. Truth be told:
- 2 million North American jobs were lost between 1990 and 2000 (due to automation -and- outsourcing)
- 6 million North American jobs were lost between 2000 and 2005 (due to automation -and- outsourcing)
Speculation: The Fourth Industrial Age (Artificial Intelligence, deep learning, and more?)
years: 2010-now
- Desiring self-driving cars, trucks, tanks and drones, DARPA
has been funding the development of
self-driving vehicles since 2004. Many industry watchers were convinced that this technology would not be ubiquitous until
2030 but it now appears that this technology could be ready (in one form or another) by 2020 partly due to advancements in "A.I.
and deep learning"
comment: Since 94% of all auto accidents are caused by human error, this technology (which will be
demanded by the insurance industries) will wipe out employment in the last large group of unskilled people: car drivers, truck
drivers, taxi drivers, limo drivers and bus drivers
- "A.I. and deep learning" have benefited most internet users since 2010, although most people have no idea this invisible revolution has been taking place; the biggest application is human-language translation (improvements are ongoing)
- Most non-nerds ignored Google's 2012 "A.I. and deep learning" announcement about being able to use computers to locate which
YouTube videos contained pictures of a cat (or not)
- Most A.I. researchers claim they always knew that A.I. could someday easily beat a chess grand master but would never beat anyone at the game of Go. So most non-nerds just shrugged when DeepMind (a division of Google's parent company, Alphabet) announced in 2015 that their "A.I. and deep learning" program named AlphaGo had just beaten a champion Go player
- Two years later in 2017, DeepMind announced AlphaZero
variations of their Chess and Go solutions. Here the word Zero means "zero knowledge from humans" (other than the simple rules
of the games). You program two similar systems then allow them to play each other many millions of times per night (in this way
they gain experience at inhuman speeds)
- During Tesla's "A.I Day" (2021-08-19) it was announced that Tesla was moving training of their vehicle's A.I. system into a
game-world where many millions of miles of driving experience will be gained without any loss of human life. Sounds not unlike
what AlphaZero was doing with Go and Chess.
Epiphany 19: The dominance of C/C++ (part 3/3)
In the early days of computing, most hardware manufacturers also created, then sold, software. Putting aside
operating
systems for a moment, those vendors also sold programming tools including programming language compilers like
Fortran
(1956),
ALGOL (1958) and
COBOL
(1959). One of the most important changes to the computing industry occurred when computer professionals got together to standardize
computer languages.
One measure of any computer language's success is, I think, the number of standardizations. For example, although the BASIC
language was once very popular (I am still forced to use VMS-BASIC
every day by my employer in 2019) it has gone through a very small number
of standardizations. This might have something to do with the fact that many BASIC implementations were so different that
standardization was not possible or, perhaps, desirable. On the other hand, languages like C, C++ and Objective-C have gone through numerous standards and continue to be improved.
For example, non-standard "C" first appeared in 1972 and is now referred to as K&R C after its authors, Brian Kernighan and Dennis Ritchie. Improvements were formalized and then published as C89 in 1989 by ANSI and as C90 by ISO. This continued with the names C99, C11 and C18 as described here.
comment: It appears that "C" moves to a new standardization level approximately every 10 years (on average) whilst C++ moves to a new level approximately every 3 years (on average)
Linux
Since 50% of all the digital devices on the planet "running an operating system" use some form of Linux,
it should be no surprise that it is the Linux community that is pushing newer versions of C and C++
Since gcc is used to build Linux, we should look at this toolset a little closer
Oddities
- The name game
- Originally, gcc meant GNU C Compiler and could compile both C and C++ programs based upon the file extension. Support was later added for Objective-C
- Then GCC (uppercase) was created, which means GNU Compiler Collection
- File extensions
- On most systems including Linux, a file extension of ".c" indicates "this is a 'c' source file"
- On most systems including Linux, a file extension of ".cpp" or ".c++" indicates "this is a c++ source file"
- Only on Linux systems have I seen that a file extension of ".C" (uppercase) also indicates "this is a c++ source file"
- On older Linux systems, these commands:
- gcc --version
- c++ --version
- g++ --version
appear to bring up the same executable (at least this is true before gcc-4) but this is not the case since gcc-4.8 (perhaps it
is just too difficult to support C11 and C89 and C99 'c' standards as well as all the new C++ standards in one Swiss-army knife)
- On a modern Linux system (like CentOS-7.5) supporting gcc-4.8 we might see this.
BASH command | RHEL/CentOS executable | comment
gcc          | /usr/bin/gcc           | same executable
c++          | /usr/bin/gcc           | same executable
g++          | /usr/bin/g++           | can support newer language standards
- Most people reading this probably compile their "c" programs using a command-line switch specifying C99 but what do you do if
you want to use a higher standard like C11? If you are running gcc-4.8 then you are good to go (just use
the appropriate command-line switch). But if you are running gcc 4.4 then you might need to go through a
couple of other steps: https://edwards.sdsu.edu/research/c11-on-centos-6/
- Over the years, some companies, including Microsoft, have been telling "c" developers to use c++03 rather than compile with
c99. (huh?)
- Speaking about GCC for a moment, most people seem to be unaware of this: /usr/bin/gfortran
Make(s)
One thing usually overlooked in this ecosystem begins with the Unix
Make utility first written in 1977 by Stuart Feldman of Bell
Labs. This was the easiest method to get an application running on another Unix implementation which may be running on a
completely foreign computer architecture. I'm jumping a lot of steps here, but after a time, Sun Microsystems thought that Make was getting a little long in the tooth so they developed a product called "Solaris Package Manager". It should surprise no one that Red Hat needed something similar in 1995 to move to Red Hat Linux 2, so they further developed the Sun approach, calling their tool "Redhat Package Manager" (file extension: '.rpm')
Epiphany 20: The Apollo 11 moon landing (50 years later)
The manned spacecraft programs in both Russia and the United States changed the world in more ways than most people would ever know.
First off, humanity cannot ignore the contribution of the Soviet space program because
the United States would have
never attempted such a thing if it were not for cold-war politics. The plain fact is this: many Americans didn't care about
Apollo but did care about beating the Russians at something. History informs us that President Richard Nixon and Henry Kissinger were plotting to terminate the Apollo Program at the very time that Nixon was congratulating, via telephone from the White House, Apollo 11 astronaut Neil Armstrong, who was standing on the Moon (both Nixon and Kissinger thought that Vietnam was a more urgent issue;
more on this in a moment). To add insult to injury, many Americans lost interest after the apparently routine flight of
Apollo
12 (e.g. we won that race twice; time to move on). Proof of this can be seen in the number of citizens who complained to the
TV networks about favorite TV programs being preempted by spaceflight coverage. Heck, in-flight video transmissions from
Apollo
13 were not aired until after a tank explosion placed the mission in jeopardy. Part of this public disinterest led to
cancellations
of Apollo flights 18-20.
FACT: development of the Apollo Guidance Computer (AGC) was the trigger for one of the largest bursts of technological progress in human history. The design work was done by Draper
Labs at MIT while manufacturing was done by Raytheon. Why was the AGC necessary? Initially, many astronauts and cosmonauts
incorrectly thought that human pilots would be able to directly fly spacecraft much in the same way that pilots flew aircraft.
Consider this thought experiment: you are flying the Lunar Module
and need to catch-up-to, then dock with, the Command
Module. For simplicity, assume that both vehicles have identical orbits and velocities but are separated by a distance
of 1000 m (3,280 ft). Without thinking about life outside of the atmosphere, you fire your RCS thrusters (force: 100 pounds or about 445 newtons) while aiming at the Command Module. This will increase your velocity, which pushes you into a higher orbit. Your speed is momentarily higher, but the higher orbit has a longer period, so the Command Module quickly passes beneath you, making it impossible to dock. One correct solution dictates that you fire your RCS thrusters so as to drop into a slightly lower (and faster) orbit, wait a short period of time, then fire your RCS thrusters in the opposite direction, which should return you to the original orbit of the CM but hopefully much closer to it (BTW, first firing forward then quickly firing backward produces a similar result). Remember "F = ma" from Isaac Newton's second law? Since "a = dV/dT", the second law can be rewritten as "F = m x dV/dT", which becomes "F x dT = m x dV" (the left side of this equation is known as impulse).
The instantaneous mass of the LM (which decreases every time you fire your thrusters) determines how long you should fire
them in every maneuver (e.g. two one-second thrusts will not produce identical results; a one-second forward burn cannot (exactly)
be cancelled by a one-second reverse burn). These real-time calculus solutions are best determined by a guidance computer because
fuel is limited so must be conserved.
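The arithmetic behind that claim can be sketched in a few lines of Python. The thrust figure comes from the text above; the two masses are placeholders chosen only to show why burn duration must track the vehicle's current mass (they are not Apollo flight data):

# impulse sketch: F*dT = m*dV, therefore dV = F*dT / m
F = 445.0   # RCS thrust in newtons (~100 pounds-force, as quoted above)

def delta_v(mass_kg, burn_seconds):
    # velocity change from a constant-thrust burn (ignores propellant loss during the burn)
    return F * burn_seconds / mass_kg

# the same one-second burn produces a different dV as the vehicle gets lighter
print(delta_v(15000.0, 1.0))   # heavier vehicle: ~0.03 m/s
print(delta_v(5000.0, 1.0))    # lighter vehicle : ~0.09 m/s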
How did the development of the AGC improve things here on Earth? First off, commercial mainframe computers in the 1960s were
manufactured from discrete electronic components, including individual transistors and diodes. So when IBM learned that the AGC
computer had to fit into a volume the size of a bread-box (one cubic foot or 28,316 cc) many IBM engineers didn't think it was
possible. The Draper/Raytheon solution employed "integrated circuits" (semiconductor chips containing numerous transistors) which
they were already using in a more primitive way inside Polaris missile guidance systems. The high per-component prices meant that
the American government was their primary customer (Apollo consumed 60% of the integrated circuits produced in America in 1966). Because of high
cost, government contractors developed semiconductor test methods to ensure that the government would only pay for components that
met design specifications. These testing methods eventually migrated from the customer (government) back to the manufacturing
industry which resulted in affordable chips for the rest of us. That revolution in chip manufacturing produced things like:
- the development of minicomputers (based on chips) that were more powerful and reliable while less expensive than existing
mainframe computers (based upon discrete components)
- early minicomputers acted as packet routers on a prototype internet known as ARPANET
- work on minicomputers caused Bell Labs to create a portable operating system known as UNIX as well as the "C" programming
language. After the breakup of the telephone system in the 1980s, corporate greed caused American UNIX to be replaced with
European Linux.
- Moore's Law (see: Epiphany 11 just above)
- the development of microcomputers for use in minicomputers (technological feedback), then the development of workstations
which eventually became just as powerful as minicomputers
- work stations were used to create the first web servers and web browsers when a British contractor, Tim
Berners-Lee, created the world-wide-web while working for CERN in
Switzerland.
comment: many Americans today falsely believe that the world-wide-web was an American invention
- microcomputers became the basis for internet packet routers (CISCO
Systems) as well as commercial and residential firewall appliances
- personal computers which eventually became just as powerful as work stations
- gaming requirements triggered the development of graphics cards and GPUs (graphics processing units which act like
thousands of special purpose CPUs)
- gaming consoles (PlayStation, Xbox, etc.)
- game consoles triggered the move from CRT technology to flat screens (the first LCD display was developed for the Sony
PlayStation 1)
- development of monochrome and color LCD displays feedback into the computer industry to replace CRT-based monitors
with LCD-based monitors. This business morphs into very-large digital displays in commercial movie theaters which
trickles down to large screen TVs in residences
- phones, tablets, phablets, pads, e-book readers, smart phones, smart TVs all connected via the internet
comment: Apollo used 3-input NOR gates manufactured by Fairchild Semiconductor. Engineers later left Fairchild to form Intel and, later still, AMD
Software jobs also changed drastically during this time. While it is true that high-level programming languages like FORTRAN
(1957) and COBOL (1959) existed, the phrase "computer programmer" did not yet exist as computer programming was
primarily done by mathematicians. High-level languages required more memory and CPU power than what was available on the
AGC, but they were employed on mainframe computers used "to run AGC flight simulations" then "generate the necessary binary code"
for the hand-made read-only core rope memory used to hold AGC
programs. The level of redundancy built into the AGC programs (see reference-1) should inform us that these people were doing "computer engineering". Look up Margaret Hamilton to see what I mean.
Critics of Apollo mention that the program was too expensive in that it was consuming too much of the national budget with 4%
being the oft quoted number. Let me remind everyone that cold-war concerns at the time mandated that the Defense budget was kept
secret. On top of that, no American citizen knew how much money was being used to support the Vietnam War. Today we know that the
total cost of Apollo (in 1968 dollars) was $25 billion whilst the cost of the Vietnam War (also in 1968 dollars) was $168 billion. Now everyone knows that it is way harder to create something than it is to destroy something, so allow me to state the
obvious: America got no return on the $168 billion investment. Check out the next chart then advise your
political representatives accordingly:
Activity: Apollo Manned Spacecraft Program
- Cost: $25 billion (1968 dollars)
- Human cost: 3 astronauts
- ROI (return on investment): metallurgy, semiconductors, computers, ARPAnet (1969) which morphed into the internet (1982), plus the admiration of the whole world
- Notes: During the peak years, the Apollo program employed ~400,000 scientists, engineers and technicians across 20,000 companies. Much of this work was done by, or managed by, defense contractors.

Activity: Vietnam War
- Cost: $168 billion (1968 dollars)
- Human cost: 58,000 US soldiers killed; 200,000 US soldiers injured; 2 million Vietnamese civilians; 1.1 million North Vietnamese fighters; 200,000 South Vietnamese soldiers; 50,000 Laotian civilians; 259,000 Cambodian civilians
- ROI: Agent Orange, plus the contempt of the whole world
- Notes: During peak years, more than 400,000 American soldiers were committed to Vietnam (almost the same number of people tied to the manned spaceflight effort). Despite crazy political justifications like "the domino theory", America lost this war and no literal or metaphorical dominoes were ever observed.

Activity: First Gulf War
- Cost: $61 billion
- Human cost: 382 US military; other casualties
- ROI: American defense contractors do well
- Notes: first use of "depleted uranium" by the Americans

Activity: Middle-East Wars
- Cost: $5.9 trillion
- Human cost: ???
- ROI: American defense contractors do well
- Notes: Does everyone know that "1 trillion" = "1,000 billion"? First use of extraordinary rendition.
war references:
- https://fas.org/sgp/crs/natsec/RS22926.pdf (published by: Congressional Research)
- https://en.wikipedia.org/wiki/Vietnam_War_casualties
- https://www.irishtimes.com/news/world/asia-pacific/death-from-below-in-the-world-s-most-bombed-country-1.3078351
- https://www.cnn.com/2013/09/15/world/meast/gulf-war-fast-facts/index.html
- https://www.cnbc.com/2018/11/14/us-has-spent-5point9-trillion-on-middle-east-asia-wars-since-2001-study.html
- comment: Since World War Two ended in 1945, the United States of America has lost every war. But no one seems to care as long as the economy is humming along.
comment: Many Americans continually bang on about the USA being a Christian nation, but I wonder if they will ever turn their spears into pruning hooks as mentioned in Isaiah 2:3-4
What About China?
Back in the 1960s, conservatism revolved around business interests which caused some Americans to wonder if China was a missed
business opportunity. This was the main reason for Nixon and Kissinger
opening relations with China in 1972. IMHO this trigger event started the Chinese shift from agriculture to industrialism.
(think of this as an industrial revolution confined to one country)
American companies and schools in the 1980s found themselves in the Reagan era of "minimal government" which now goes by the name
austerity. This might have translated into greater economic problems, including unemployment, except for the actions of China's
leader, Deng Xiaoping, who favored "maximal government" and so paid to send thousands of Chinese students to the USA every year to be educated.
I personally experienced this in Boston in 1985: we had morning
lectures and afternoon labs. An English-speaking Chinese student sat one row ahead of me in the lecture hall accompanied by two
minders who could not speak English but were required to pay for student slots (these minders were there to ensure the student
would return to China; they passed the day in class by reading little brown books of political dogma). Back then, Americans
correctly welcomed these foreign students (it was a business opportunity) but no one ever thought that China would eventually
compete head-to-head with the USA. I applaud the Chinese students who were able to acquire an education in a foreign country
speaking a foreign language but wonder how many Americans would be willing, or able, to do the same by traveling to China.
Comments:
- China is investing in its citizens while America is investing in its military, which makes me wonder if America will, one day, recognize "an education gap" much in the way it became mobilized by the Soviet launch of Sputnik 1 in 1957. Many IQ charts show average IQ levels in Hong Kong and Singapore (score: 109) are a full eleven points
ahead of the USA (score: 98). Also notice that countries with high levels of religious extremism tend to have lower
IQs.
- since the 1980s, China has moved more than 300 million Chinese citizens from the poor agricultural class into the middle class. This is approximately the same size as the total population of the USA, where "middle class" is becoming a fading memory.
- Since American educators showed China how to modernize, I can't seem to stop thinking about the story of "the Tortoise and the
Hare". If Charles Dickens were alive I am certain he would be toying with writing a book titled "A Tale of Two Countries"
Steps to MAGA (Make America Great Again)
- ELIMINATE OFFENSE SPENDING
- The chart above (Apollo vs. various wars) provides proof that many millions of people would be alive today if the USA had not been funding foreign wars on foreign shores. One way to make America great again is to stop funneling taxpayer money into those defense programs which, in reality, are offense programs used to support the American empire
- FUND SCIENCE and TECHNOLOGY
- Like an addict that is unable to quit cold-turkey, defense contractors will probably not be able to survive having their
government funds reduced. But just as happened during the 1950s and 1960s, defense contractors could be employed by the
government to do big science projects through civilian agencies like NASA
- During the 1990s, the
American defense budget was always under $300 billion per year. By 2010 the defense budget had climbed to $721 billion
per year. Meanwhile, NASA's budget has fluctuated between $17 and $20 billion since 2006. If NASA's budget were doubled
by diverting money from the defense budget would the Pentagon even notice? And yet we know that spending on NASA will have a
positive ROI (return-on-investment)
- DARPA is one American defense investment with a very high ROI
- FREE EDUCATION
- There are still many people alive today who will tell you that they attended Harvard University in the 1950s and only paid
$100.00 tuition
- Before defense contractors started receiving the bulk of government monies, the American government used to subsidize all
college and university educations. Funding for education slowly dried up as money was diverted into the defense/offense
budgets.
- Once money is diverted from offense back into education, the whole economy will repair itself
Epiphany 21: Microsoft is promoting Python3 (Python version 3.x)
2020-05-06: Today's alert from ZDNet informs me that Microsoft has added 51 Python videos to their previous 44. I read somewhere that
because Bill Gates had started programming in BASIC, that Microsoft could never really get rid of it (other than the massive changes
to "visual studio" between VS6 and VS.net where they made it play-nice with C/C++ and anything else that generated code under the
.net framework). I wonder what Bill would say about the shift to Python?
“Python for beginners” playlist from Microsoft (this is the original 44)
https://www.youtube.com/playlist?list=PLlrxD0HtieHhS8VzuMCfQD4uJ9yne1mE6
notes:
- video #4 shows how to configure Visual Studio Code, which can now be acquired and used FREE of CHARGE
- Visual Studio Code can integrate with a Git repository, including GitHub
“More Python for beginners” (this is the new 20)
https://www.youtube.com/playlist?list=PLlrxD0HtieHiXd-nEby-TMCoUNwhbLUnj
“Even More Python for Beginners” (this is the new 31)
https://www.youtube.com/playlist?list=PLlrxD0HtieHhHnCUVtR8UHS7eLl33zfJ-
Epiphany 22: Machine Learning is a real thing
- While my technical life has contained a lot of "holy crap" moments, there have only been a few times where my epiphany
meter maxed out at 100%.
- Perhaps I am a bit late to this epiphany but there has been a quiet revolution happening in the area of machine learning, which seems to have reached critical mass in 2007 when the Python library scikit-learn began as a Google Summer of Code project. Since then, this field has exploded, with most work being performed by internet companies. Why? In machine learning, "data generates code" rather than the other way around, so using the enormous amount of data available via the internet to train your machine results in better code solutions (a minimal sketch appears after this list).
comment: in 2021 humanity discovers that data (or information) is much more valuable than energy
- Then in 2012 Google announced a breakthrough in the ability to detect a cat (or not) anywhere in YouTube videos (Google's training data).
- Since then, Alphabet/Google companies (DeepMind is one) have made huge breakthroughs in goal-oriented game-playing machines like AlphaGo (2016) and AlphaZero (Go and Chess in 2017).
comment: here, the word "zero" means zero human input (usually achieved by forcing two machines to play several million games against each other)
- Companies like Tesla employ as many as 1,000 people developing machine learning, which they intend to use to implement Full Self Driving (FSD)
- comment: these systems are not yet capable of consciousness. It may be better to think about them as super
neural networks (or super spinal cords)
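As promised above, here is a minimal "data generates code" sketch using scikit-learn (this assumes scikit-learn is installed; the tiny bundled iris dataset stands in for the internet-scale training data mentioned above):

# minimal machine-learning sketch: the model's behaviour is learned from data
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)                       # small example dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)                             # "data generates code"
print("held-out accuracy:", model.score(X_test, y_test))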
caveats: both machine learning and artificial intelligence have a lot of history going back to the 1940s and 1950s, so they contain a lot of non-computer terminology. Modern computer technologists wishing to learn more might wish to start with the external links below:
External Links
Neil Rieck
Waterloo, Ontario, Canada.