Technological Change (in my life) part-3
edit: 2024-12-04 (this is a stream-of-consciousness thing)
Epiphany 20: The Apollo 11 moon landing (50 years later)
The manned spacecraft programs in both Russia and the United States changed the world in more ways than most people would ever know. First off, humanity cannot ignore the
contribution of the Soviet space program because
the United States would have never attempted such a thing if it were not for cold-war politics. The
plain fact is this: many Americans didn't care about Apollo but did care about beating the Russians at something. History informs us that President Richard Nixon and Henry
Kissinger were plotting to terminate the Apollo Program at the very moment Nixon was congratulating, via telephone from the White House,
Apollo 11 astronaut Neil Armstrong, who was standing on the moon (both Nixon and Kissinger thought that Vietnam was a more urgent issue; more on this in a moment). To add insult to
injury, many Americans lost interest after the apparently routine flight of
Apollo 12 (i.e. we won that race twice;
time to move on). Proof of this can be seen in the number of citizens who complained to the TV networks about favorite TV programs being preempted by spaceflight coverage.
Heck, in-flight video transmissions from
Apollo 13 were not aired until after a tank explosion placed the mission in
jeopardy. This public disinterest partly led to the cancellation of Apollo flights 18-20.
FACT: development of the Apollo Guidance Computer (AGC) was the trigger event for an enormous burst of human technological progress. The design work was done by Draper Labs at MIT while manufacturing was done
by Raytheon. Why was the AGC necessary? Initially, many astronauts and cosmonauts incorrectly thought that human pilots would be able to directly fly spacecraft much in the
same way that pilots flew aircraft. Consider this thought experiment: you are flying the Lunar Module and
need to catch-up-to, then dock with, the Command Module. For simplicity, assume that both
vehicles have identical orbits and velocities but are separated by a distance of 1000 m (3,280 ft). Without thinking about life outside of the atmosphere, you fire your RCS
thrusters (force: 100 pounds or 445 Newtons) while aiming at the Command Module. This will increase your velocity, which pushes you into a higher orbit. Your new orbital
velocity is faster but your orbital period is now longer. This causes the Command Module to quickly pass under you, making it impossible to dock. One correct solution dictates
that you should fire your RCS thrusters away from the target vehicle, which will cause you to drop into a slightly lower (and faster) orbit; then wait a short period of time; then fire
your RCS thrusters in the opposite direction, which should return you to the same orbit as the CM but hopefully much closer (BTW, first firing forward then quickly
firing backward produces the same result). Remember "F = ma" from Isaac Newton's second law? Since
"a = dV/dT", the second law can be rewritten as "F = m x dV/dT" which becomes "F x dT = m x dV" (the left side of this equation is known as impulse).
The instantaneous mass of the LM (which decreases every time you fire your thrusters) determines how long you should fire them in every maneuver (e.g. two one-second
thrusts will not produce identical results; a one-second forward burn cannot (exactly) be cancelled by a one-second reverse burn). These real-time calculus solutions are
best determined by a guidance computer because fuel is limited and must be conserved (a code sketch of this effect appears below).
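Here is a minimal Python sketch of the impulse equation above. The thrust is the 100-pound figure from the thought experiment; the LM mass and exhaust velocity are illustrative assumptions of mine, not actual Apollo values:

```python
# Sketch: why two "identical" one-second burns produce different results.
# F x dT = m x dV  ==>  dV = F * dT / m   (valid for short burns)
THRUST = 445.0             # newtons (one 100-pound RCS thruster)
EXHAUST_VELOCITY = 2900.0  # m/s, assumed; sets how fast propellant is consumed

def delta_v(mass_kg: float, burn_s: float) -> float:
    return THRUST * burn_s / mass_kg

mass = 15000.0                 # assumed instantaneous LM mass (kg)
for burn in (1.0, 1.0):        # two "identical" one-second burns
    print(f"mass = {mass:9.3f} kg   dV = {delta_v(mass, burn):.7f} m/s")
    mass -= (THRUST / EXHAUST_VELOCITY) * burn  # propellant used (F = Ve * dm/dt)
```

The second burn yields a slightly larger dV because the vehicle is now lighter, which is exactly why the AGC, not the pilot, computed burn durations.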
How did the development of the AGC improve things here on Earth? First off, commercial mainframe computers in the 1960s were manufactured from discrete electronic
components, including individual transistors and diodes. So when IBM learned that the AGC had to fit into a volume the size of a bread-box (one cubic foot or
28,316 cc), many IBM engineers didn't think it was possible. The Draper/Raytheon solution employed "integrated circuits" (semiconductor chips containing numerous
transistors), which they were already using in a more primitive way inside Polaris missile guidance systems. The high per-component prices meant that the American government
was the primary customer (Apollo consumed 60% of the integrated circuits produced in America in 1966). Because of this high cost, government contractors developed semiconductor test methods
to ensure that the government would only pay for components that met design specifications. These testing methods eventually migrated from the customer (government) back to
the manufacturing industry which resulted in affordable chips for the rest of us. That revolution in chip manufacturing produced things like:
- the development of minicomputers (based on chips) that were more powerful and reliable while less expensive than existing mainframe computers (based upon discrete
components)
- early minicomputers acted as packet routers on a prototype internet known as ARPANET
- work on minicomputers caused Bell Labs to create a portable operating system known as UNIX as well as the "C" programming language. After the breakup of the
telephone system in the 1980s, corporate greed caused American UNIX to be replaced with European Linux.
- Moore's Law (see: Epiphany 11 just above)
- the development of microprocessors, which were then used inside minicomputers (technological feedback), then the development of workstations which eventually became just as powerful as minicomputers
- work stations were used to create the first web servers and web browsers when a British contractor, Tim
Berners-Lee, created the world-wide-web while working for CERN in Switzerland.
comment: many Americans today falsely believe that the world-wide-web was an American invention
- microcomputers became the basis for internet packet routers (CISCO Systems) as well as commercial and
residential firewall appliances
- personal computers which eventually became just as powerful as work stations
- gaming requirements triggered the development of graphics cards and GPUs (graphics processing units which act like thousands of special purpose CPUs)
- gaming consoles (PlayStation, Xbox, etc.)
- game consoles helped trigger the move from CRT technology to flat screens
- the development of monochrome and color LCD displays feeds back into the computer industry to replace CRT-based monitors with LCD-based monitors. This business morphs into very large digital displays in commercial movie theaters, which trickles down to large-screen TVs in residences
- phones, tablets, phablets, pads, e-book readers, smart phones, smart TVs all connected via the internet
comment: Apollo used 3-input NOR gates manufactured by Fairchild Semiconductor. Fairchild engineers later left to form Intel and, later still, AMD.
Software jobs also changed drastically during this time. While it is true that high-level programming languages like FORTRAN (1957) and COBOL (1959) existed, the phrase
"computer programmer" did not yet exist, as computer programming was primarily done by mathematicians. High-level languages required more memory and
CPU power than what was available on the AGC, but they were employed on mainframe computers to run AGC flight simulations and then generate the necessary binary code
for the hand-made read-only core rope memory used to hold AGC programs. The level of redundancy built into the
AGC programs (see reference-1) should inform you that these people were doing "computer engineering". Click
Margaret Hamilton to see what I mean.
Critics of Apollo mention that the program was too expensive in that it was consuming too much of the national budget, with 4% being the oft-quoted maximum number. Let me
remind everyone that cold-war concerns at the time mandated that the Defense budget be kept secret. On top of that, no American citizen knew how much money was being used
to support the Vietnam War. Today we know that the total cost of Apollo (in 1968 dollars) was $25 billion whilst the cost of the Vietnam War (also in 1968 dollars) was $168
billion. Now everyone knows that it is much harder to create than to destroy, so allow me to state the obvious: America got no return on the $168 billion investment.
Check out the next chart then advise your political representatives accordingly or vote differently.
| Activity | Cost | Human Cost | ROI (return on investment) | Notes |
|---|---|---|---|---|
| Apollo Manned Spacecraft Program | $25 billion (1968 dollars) | 3 astronauts | metallurgy, semiconductors, computers, ARPAnet (1969) which morphed into the internet (1982); admiration of the whole world | During the peak years, the Apollo program employed ~400,000 scientists, engineers and technicians across 20,000 companies. Much of this work was done by, or managed by, defense contractors. |
| Vietnam War | $168 billion (1968 dollars) | 58,000 US soldiers killed; 200,000 US soldiers injured; 2 million Vietnamese civilians; 1.1 million North Vietnamese fighters; 200,000 South Vietnamese soldiers; 50,000 Laos civilians; 259,000 Cambodia civilians | agent orange; contempt of the whole world | During peak years, more than 400,000 American soldiers were committed to Vietnam (almost the same number of people tied to the manned spaceflight effort). Despite hearing crazy political justifications like "the domino theory", America lost this war but no literal or metaphorical dominoes were ever observed. |
| First Gulf War | $61 billion | 382 US military killed; other casualties: see the war references below | American defense contractors do well | first use of "depleted uranium" by the Americans (a war crime?) |
| Middle East Wars | $5.9 trillion | ??? | American defense contractors do well | Many American citizens do not know that "1 trillion" = "1,000 billion"; first use of: extraordinary rendition |
war references:
- https://fas.org/sgp/crs/natsec/RS22926.pdf (published by: Congressional Research Service)
- https://en.wikipedia.org/wiki/Vietnam_War_casualties
- https://www.irishtimes.com/news/world/asia-pacific/death-from-below-in-the-world-s-most-bombed-country-1.3078351
- https://www.cnn.com/2013/09/15/world/meast/gulf-war-fast-facts/index.html
- https://www.cnbc.com/2018/11/14/us-has-spent-5point9-trillion-on-middle-east-asia-wars-since-2001-study.html
comments:
- Since world-war-two ended in 1945, the United States of America has lost every war but no one seems to care as long as the economy is humming along.
- Many Americans continually bang-on about the USA being a Christian nation but I wonder if they will ever turn their spears into pruning hooks as is mentioned in
Isaiah 2:3–4
What About China?
Back in the 1960s, conservatism revolved around business interests, which caused some Americans to wonder if China was a missed business opportunity. This was the main reason
for Nixon and Kissinger opening relations with China in 1972, which was the trigger event that started the Chinese shift from agriculture to industrialism (think of this as an industrial revolution confined to one country).
American companies and schools in the 1980s found themselves in the Reagan era of "minimal government" which now goes by the name austerity. This might have translated
into greater economic problems, including unemployment, except for the actions of China's leader, Deng Xiaoping,
who favored "maximal government" and so was paying to send thousands of Chinese students to the USA every year to be educated.
I personally experienced this in the 1980s when I was studying as a computer engineer in Boston (on this one
occasion I was attending classes at: 12 Crosby Drive, Bedford, Massachusetts). Based upon a luck of the draw, our class engaged in "morning lectures" and "afternoon labs".
An English-speaking Chinese student sat one row ahead of me in the lecture hall accompanied by two minders who spoke very little English but were still required to pay for
student slots (IIRC, the price was $5k per week; these minders ensured the student would return to China; they passed the day in class by reading
little brown books of political dogma). Back then, Americans correctly welcomed these foreign students as a business opportunity but none of them thought that China would
eventually compete head-to-head with the USA. I applaud the Chinese students who were able to acquire an education in a foreign country speaking a foreign language but
wonder how many Americans would be willing, or able, to do the same by traveling to China.
Comments:
- China is investing in their citizens while America is investing in their military which makes me wonder if America
will, one day, recognize "an education gap" much in the way they became mobilized by the Soviet launch of Sputnik
1 in 1957. Many IQ charts (including these two: here and here)
show that average IQ levels in Hong Kong and Singapore (score: 109) are a full eleven points ahead of the USA (score: 98). Also
notice that countries with high levels of religious extremism tend to have lower IQs.
- since the 1980s, China has moved more
than 300 million Chinese citizens from the poor agricultural class into the middle class. This is approximately the same size as the total population of the USA
where "middle class" is becoming a fading memory.
- Since American educators showed China how to modernize, I can't seem to stop thinking about the story of "the Tortoise and the Hare". If Charles Dickens were alive, I
am certain he would be toying with writing a book titled "A Tale of Two Countries".
Steps to MAGA (Make America Great Again)
- ELIMINATE OFFENSE SPENDING
- The colored chart (above) provides proof that many millions of people would be alive today if the USA had not been funding foreign wars on foreign shores.
One way to make America great again is to stop funneling taxpayer money into those defense programs which, in reality, are offense programs used to support the
American empire
- FUND SCIENCE and TECHNOLOGY
- Like an addict who is unable to quit cold-turkey, defense contractors will probably not be able to survive having their government funds reduced. But just as
happened during the 1950s and 1960s, defense contractors could be employed by the government to do big science projects through civilian agencies like NASA
- During the 1990s, the American defense budget was always under $300
billion per year. By 2023 the defense budget had climbed to $785 billion per year. Meanwhile, NASA's budget has fluctuated between $17 and $20 billion since
2006. If NASA's budget were doubled by diverting money from the defense budget, would the Pentagon even notice? And yet we know that spending on NASA will have a
positive ROI (return-on-investment)
- DARPA is one American defense investment with a very high ROI
- FREE EDUCATION
- There are still many people alive today who will tell you that they attended Harvard University in the 1950s and only paid $100.00 tuition
- Before defense contractors started receiving the bulk of government monies, the American government (via the Pentagon) used to subsidize all college and university
educations. Funding for education slowly dried up as money was diverted into the defense/offense budgets.
- Once money is diverted from offense back into education, the whole economy will repair itself
Epiphany 21: Microsoft is promoting Python3 (Python version 3.x)
2020-05-06: Today's alert from ZDNet informs me that Microsoft has added 51 Python videos to their previous 44. I read somewhere that because Bill Gates had started programming
in BASIC, Microsoft could never really get rid of it (other than the massive changes to "visual studio" between VS6 and VS.net, where they made it play nice with C/C++
and anything else that generated code under the .net framework). I wonder what Bill would say about the shift to Python?
“Python for beginners” playlist from Microsoft (this is the original 44)
https://www.youtube.com/playlist?list=PLlrxD0HtieHhS8VzuMCfQD4uJ9yne1mE6
notes:
- video #4 shows how to configure Visual Studio Code which can now be acquired and used FREE of CHARGE
- Visual Studio Code can integrate with a Git repository including GitHub
“More Python for beginners” (this is the new 20)
https://www.youtube.com/playlist?list=PLlrxD0HtieHiXd-nEby-TMCoUNwhbLUnj
“Even More Python for Beginners” (this is the new 31)
https://www.youtube.com/playlist?list=PLlrxD0HtieHhHnCUVtR8UHS7eLl33zfJ-
Epiphany 22: Machine Learning is a real thing
- While my technical life has contained a lot of "holy crap" moments, there have only been a few times where my epiphany meter maxed out at 100%.
- Perhaps I am a bit late to this epiphany but there has been a quiet revolution happening in the area of machine learning, starting in 2007 when the Python
library Scikit-learn began life as a Google Summer of Code project before being made freely available to everyone. Since then, this field has exploded,
with most work being performed by internet companies. Why? In machine learning, "data generates code" rather than the other way around, so using the enormous amount of data
available via the internet to train your machine results in better code solutions (a code sketch of this idea appears at the end of this epiphany).
comment: in 2021 humanity discovers that data (or information) is much more valuable than energy
- Then in 2012 Google announced a breakthrough in the ability to detect a cat (or not) anywhere in YouTube videos (Google's training data).
- Since then, Alphabet/Google companies (DeepMind is one) have made huge breakthroughs in goal-oriented game-playing machines like AlphaGo (2016) and AlphaZero (Go and Chess in 2017).
comment: here, the word "zero" means zero human input (usually achieved by forcing two machines to play several million games with each other)
- Companies like Tesla employ as many as 1,000 people developing Machine Learning, which they intend to use to implement Full Self Driving (FSD)
- comment: these systems are not yet capable of consciousness. It may be better to think about them as super neural networks (or super
spinal cords)
caveats: both machine learning and artificial intelligence have a lot of history going back to the 1940s and 1950s, so the field contains a lot of non-computer terminology. Modern computer technologists wishing to learn more might wish to start here:
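As promised above, here is a minimal sketch of "data generates code" using the freely available Scikit-learn library (the dataset and classifier choices are mine, picked only for brevity):

```python
# Sketch: in machine learning, the data writes the program's rules.
# Requires: pip install scikit-learn
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)  # the data (150 labelled flower measurements)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = DecisionTreeClassifier()  # an empty "program" with no rules yet
model.fit(X_train, y_train)       # the training data generates the rules
print("accuracy on unseen data:", model.score(X_test, y_test))
```

Notice that nobody wrote an if-then rule about petals or sepals; the decision tree was generated entirely from the data, and more (or better) training data generally produces a better tree.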
Epiphany 23: 5G is still confusing some
Not sure why some people are still complaining about 5G. Is it ignorance or something else? Here's a thumbnail table of facts:
| Gen | Lifetime | Features | Frequency | Data Speed | Max Power |
|---|---|---|---|---|---|
| 0G | | non-cellular radiotelephone service: click here to learn more | | | |
| 1G | 1980-1990 | analog cellular | 30 kHz | 2 kbps | |
| 2G | 1990-2000 | digital cellular (GSM), text messaging | 1.8 GHz | 64 kbps | |
| 2.5G | | email, web, camera phones | | | |
| 3G | 2000-2010 | smart phones, video calls; CDMA, UMTS, EDGE | 1.6 - 2 GHz | 144 kbps - 2 Mbps | |
| 4G | 2010-2020 | LTE, WiFi | 2 - 8 GHz | 100 Mbps - 1 Gbps | 0.25 milliwatts |
| 5G | 2020-2030 | wwww (world wide web wireless) | 3 - 300 GHz | up to 1 Gbps | 100 milliwatts |
| 6G | 2030-???? | click here | | | |
| CB (Citizen Band) Radio | 1958-present | | 27 MHz | | 4 watts |
| School room laser pointer | | | | | 5 milliwatts |
| Hearing aid | | | | | 1 milliwatt |
- Residential microwave ovens cook by heating water molecules with a frequency of 2.45
GHz which has nothing to do with any special resonance frequency of water molecules. Click here and here for more information about this public misconception.
- While residential microwave ovens typically cook at 1200 Watts, inverter-powered variants are able to reduce their cooking power in 10% decrements via the front panel
- Smart phones transmit much less power (4G=0.25mW max; 5G=100mW max) than microwave ovens but automatically reduce their transmitter power by continually measuring the
strength of the signal received from the cell tower (similar to you needing to speak louder at a noisy party). Reducing power also allows your phone to run longer before
needing a recharge.
- But if you are still concerned about holding a radio transmitter next to your brain then use "the hands free feature" or "ear buds" or "wired head phones" because the inverse-square law applies to radio waves as well as light (see the sketch at the end of this epiphany).
- Caveat: "ear buds" are wireless Bluetooth devices which work in the 2 - 2.4 GHz range but only utilize a tiny amount of power (range is usually a few meters). If this still makes you nervous then use wired head phones.
- how do ear buds work? video (prepare to be shocked)
- how does Bluetooth work? video
- Many people have lived in the presence of CB radios ever since they were introduced in 1958. While the frequency of CB radio may be 10,000 times lower than 5G, notice
that the power output is 40 times higher.
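Here is a minimal Python sketch of the inverse-square law mentioned above, assuming an idealized isotropic (point) radiator; real phone antennas are directional, so treat the numbers as illustrative:

```python
# Sketch: radiated power density falls off with the square of the distance.
import math

def power_density(tx_watts: float, distance_m: float) -> float:
    """W/m^2 at a given distance from an isotropic (point) radiator."""
    return tx_watts / (4 * math.pi * distance_m ** 2)

PHONE_5G = 0.100  # watts (the 100 mW maximum from the table above)
for d in (0.01, 0.25, 1.00):  # held at ear vs. ear-bud range vs. arm's length
    print(f"{d * 100:6.1f} cm : {power_density(PHONE_5G, d):.6f} W/m^2")
```

Moving the transmitter from 1 cm (held at your ear) to 25 cm away reduces the power density by a factor of 625, which is the whole argument for hands-free operation.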
Epiphany 24: big advancements happen every 10 years
Processor Hardware:
| decade | technology | notes |
|---|---|---|
| 1940s | electromechanical | think ENIAC and Enigma |
| 1950s | vacuum tube (known as valves to you Brits) | Mainframe computers are now possible |
| 1960s | discrete semiconductor (diodes and transistors) | Minicomputers are now possible |
| 1970s | integrated circuits | Personal computers are now possible |
| 1980s | VLSI (very large-scale integration) | Workstations are now possible |
| 1990s | ULSI (ultra large-scale integration) | Graphics cards and large game consoles are now possible: ATI Technologies (1985), Nvidia (1993), PlayStation (1994) |
| 2000s | | Graphics cards morph into vector + tensor processors (this technology helped to enable Artificial Intelligence) |
notes: these are not hard dates. For example, integrated circuits were invented in 1959 but only NASA and the American military could afford them
during the 1960s. While minicomputer processors might have been using integrated circuits in the 1960s, these systems usually employed magnetic core memory. Once integrated
circuits were used to manufacture computer memory, personal computers became economically feasible.
Networking:
| decade | technology | notes |
|---|---|---|
| 1970s | ARPAnet | created in 1969 to connect universities |
| 1980s | internet | created in 1982 to interconnect various networks |
| 1990s | www (world-wide-web) | created 1989-1991 to simplify document sharing at CERN; the public becomes infatuated with www and email around 1995 |
| 2000s | search and cloud computing | Google helps you locate data (2002); AWS (Amazon Web Services) helps you store your data (2002); CERN creates the world-wide-grid (this name never becomes popular) |
| 2010s | social media | Reddit (2005), Facebook (2007), Twitter (2008) |
| 2020s | Artificial Intelligence | Google restarts interest in Machine Learning (2007); Google's DeepMind beats humans at Go (2016) and Chess (2017); Google invents the Machine Learning Transformer (2017); OpenAI uses Google's Transformer in GPT-3 (2020); OpenAI releases ChatGPT (2022-12), which does badly on tests like the LSAT; OpenAI releases GPT-4 four months later and it outscores most humans on the LSAT; caveat: almost all A.I. systems are trained by looking at data on the internet |
notes: Artificial Intelligence may be as important today as was the invention of the internet back in the 1980s and probably cannot be stopped (well,
one country may stop development but would this apply to all others?)
speculation: Artificial Intelligence is nowhere close to Artificial Consciousness but will this always be the case? It has been estimated that the
human brain uses the equivalent power of 15-30 watts while IBM's Watson used 85 kW to play Jeopardy, which is ~4,250 times higher (85,000 W ÷ 20 W). Couple this with the fact that AI and ML systems are trained with huge data sets, and it
should not be surprising that these systems often detect medical problems that human doctors overlook. So what do we do when a generative AI makes recommendations for the
next generation of computer hardware? IMHO this is where "science and technology" meet "science fiction".
Epiphany 25: Star Trek (art imitates life?)
I have been a life-long fan of the whole Star Trek genre, and own a laminated poster titled "Everything
I've Ever Learned in Life, I learned from Star Trek". However, one weak plot point involves the
Borg from Star Trek: The Next Generation (a.k.a. ST:TNG). I found it
absurd that "individual Borg ignored humans walking around the Borg ship". One common explanation says the Borg drones are preoccupied by the conversations of the "Borg
hive mind".
[[[ any good software engineer would have individual drones receive an emergency interrupt which would temporarily disconnect them from the hive in the event of an emergency ]]]
Last week (2023-11-17) I noticed a lot of people at the local gym wasting time while exercising their thumbs scrolling on smart phones. I needed to get on one machine
that was occupied by a thumb-scroller, so I asked him if I could do a quick set between his. This guy was so lost in cyberspace that he didn't hear my request, and didn't
respond until I began waving my hands in his field of view. And that's when it hit me: People on social media are part of some sort of "human hive mind" (an
incredibly dumbed-down one, filled with social distractions)
Epiphany 26: Art imitates life, part-2
I just (Nov 2024) played
Metro Awakening on a PS5 with a VR2 headset (a.k.a. PSVR2)
- two OLED displays, one for each eye, with a resolution of 2000 x 2040 pixels per eye. A 3d effect is simulated by parallax (differences between the two displays)
- four cameras look outward at your surroundings.
- four cameras look inward at your eye-balls.
- gyros (I think) to measure your up-down head-tilt or left-right swivel (or is this performed in software by comparison with a reference image of your play area?)
- CAVEAT: "Metro Awakening" is not a sci-fi game, but when the VR2 initializes, it displays a network of horizontal and vertical lines superimposed on the walls, floor and ceiling of your play area. This should remind you of the holodeck in Star Trek: The Next Generation
Neil Rieck
Waterloo, Ontario, Canada.