Technological Change (in my life) part-2

edit: 2021-11-28 (this is a stream-of-consciousness thing)

Epiphany 10: Smart Phones are really Star Trek devices

Epiphany 11: "Moore's Law Revisited" and the 'Big Data' Revolution

note: The following was partly inspired by the book Thank You for Being Late (2016) by Thomas Friedman

The Chessboard

The Integrated Circuit (chip)

The Speed of Change

In response to the 1992 Russo-American moratorium on nuclear testing, the US government started ASCI (Accelerated Strategic Computing Initiative) in 1996 to provide a method for developing nuclear weapons through simulation. ASCI Red was delivered in 1997 and could execute 1.8 TeraFLOPS. It cost $55 million, was a little smaller than a tennis court, and required the equivalent power of 800 homes. In 2006 Sony released the PS3, which could also execute 1.8 TeraFLOPS at a cost of $300 while only requiring the equivalent power of three 120 W light bulbs.
Excerpt from page 41 of "Thank You for Being Late" (2016) by Thomas Friedman
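
To put those numbers in perspective, here is a back-of-envelope FLOPS-per-watt comparison in Python. The only figure not quoted above is the power draw of an "average home", which I am assuming to be roughly 1 kW purely for illustration.

  # Rough FLOPS-per-watt comparison of ASCI Red (1997) vs the PS3 (2006).
  # Assumption: an "average home" draws roughly 1 kW (illustrative only).
  asci_red_flops = 1.8e12            # 1.8 TeraFLOPS
  asci_red_watts = 800 * 1000        # 800 homes x ~1 kW each (assumed)
  ps3_flops = 1.8e12                 # 1.8 TeraFLOPS
  ps3_watts = 3 * 120                # three 120 W light bulbs

  print(f"ASCI Red: {asci_red_flops / asci_red_watts:,.0f} FLOPS per watt")  # ~2.3 million
  print(f"PS3:      {ps3_flops / ps3_watts:,.0f} FLOPS per watt")            # ~5 billion
  # The PS3 delivers roughly 2,000x more computation per watt, nine years later.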

Big Data Revolution of 2007-2008

Epiphany 12: GPUs and Moore's Law

The original definition of Moore's Law stated that the number of transistors on a single chip would "double every year" while "costing the consumer the same amount".
note: after the first decade, Gordon Moore revised the doubling period to every 2 years.

I remember reading articles in the mid-1990s claiming that Moore's Law would hit a limit in 2005 or 2006. The main reasons given were manufacturing problems related to "the limits of photolithography" and "gate sizes so small that electrons would be able to quantum tunnel across barriers, rendering them conductors rather than semiconductors".

Around 2004 Intel announced a change in direction away from "faster single CPUs" toward "single multi-core CPUs", for which they charged more money. Technically speaking, they avoided Moore's Limit by revising Moore's Law to drop the phrase "costing the consumer the same amount". So now we have an economic Moore's Law as well as a technical one.
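
To see what a 2-year doubling period implies, here is a minimal Python sketch. The baseline (the Intel 4004 with about 2,300 transistors in 1971) is my own choice for illustration, not something from Friedman's book.

  # Project the technical Moore's Law (doubling every 2 years) from the Intel 4004 baseline.
  base_year, base_transistors = 1971, 2300
  for year in (1981, 1991, 2001, 2011, 2021):
      doublings = (year - base_year) / 2
      predicted = base_transistors * 2 ** doublings
      print(f"{year}: ~{predicted:,.0f} transistors per chip")
  # 2021 works out to roughly 77 billion, which is in the same ballpark as
  # today's largest GPUs; the technical version of the law has held up well.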

Advances including FinFET technology, Tri-Gate technology, Gate-All-Around (GAA) technology, and 3D IC stacking have enabled the semiconductor industry to keep innovating. Most people reading this page will already be aware that the computing industry appears to be shifting from CPUs to GPUs. I was surprised to learn that graphics cards from Nvidia have beaten the technical Moore's Law for the last few product iterations.

Nvidia science cards (graphics cards without a video connector) have broken Moore's Law with each new product generation

Year: Product (gate size in nm)
2010: Fermi (40 and 28 nm) - https://en.wikipedia.org/wiki/Fermi_(microarchitecture)
2012: Kepler (28 nm) - https://en.wikipedia.org/wiki/Kepler_(microarchitecture)
2014: Maxwell (28 nm) - https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)
2016: Pascal (16 and 14 nm) - https://en.wikipedia.org/wiki/Pascal_(microarchitecture)
2017: Volta (12 nm) - https://en.wikipedia.org/wiki/Volta_(microarchitecture)
      (see also: https://www.theregister.co.uk/2017/05/24/deeper_dive_into_gtc17/)

Epiphany 13: The Second Renaissance (Storm before the Calm?)

The Renaissance was a period in European history, from the 14th to the 17th century, regarded as the cultural bridge between the Middle Ages and our modern world. It needs to be mentioned that most people at the time did not know they were living in a renaissance.
  1. Before Johannes Gutenberg introduced printing to Europe in 1439, it is safe to say that Europe was effectively run as a Catholic theocracy centered in Rome. But Gutenberg's technology enabled an alternative view of Christianity to flourish, which we now refer to as the Protestant Reformation. Many people thought it was their duty to reject this change, which eventually plunged Europe into the Thirty Years' War (1618-1648). While many Europeans died unnecessarily, European culture survived. The main takeaway from Gutenberg's technology is this: because many more people were able to get access to books, many learned to read. Some even learned to question church authority.
  2. Galileo Galilei is best known for employing a telescope (a.k.a. Dutch spy glass) to observe four large moons orbiting Jupiter rather than to observe objects on Earth. The Cliffs Notes version of the story has him going up against the Vatican, who clung to their belief that every heavenly body orbited an Earth located at the center of God's universe. Galileo was found guilty in 1633 (this was an inquisition) and sentenced to live out the remaining years of his life under house arrest (he was too well known to be put to death). YOU WOULD THINK that Galileo could have better defended himself by convincing his inquisitors to look into the telescope eyepiece while it was pointed at the moons of Jupiter. In fact, he tried this, but his accusers refused to look into it, claiming the device was the work of the devil.
    It needs to be pointed out that, without a telescope to extend human senses, it is very difficult to determine whether "the sun orbits the earth" or "the earth orbits the sun" (non-telescopic measurements by Tycho Brahe came very close, but you needed to be a mathematician to interpret the results). This was the beginning of an era where technology was used to extend the limits of the human senses.
  3. Within ten years of Galileo's trial, some early scientists had begun rearranging telescope lenses to build microscopes, which then facilitated the discovery of bacteria and cell biology. Once again, technology had been employed to extend human senses into a new realm.
  4. There have been countless improvements in telescopes and microscopes since those days but it would appear that humanity had hit a new limit. I say "had" because the invention of smaller yet more powerful computers, along with computer networking via the internet, has ushered in a new age of astronomy and biology where we have put electronic eyes on our machines. For example, computers associated with the Kepler spacecraft are able to sense planetary occultation of distant stars. This is something human eyes could never do. Similarly, computers, robotic sequencers, and the internet enabled the first two Human Genome Projects as well as the Human Proteome Project.

Observations/Speculations

  1. The internet actually got its start in 1969 as ARPAnet but was never taken seriously by the world until 1991, when Tim Berners-Lee invented the World Wide Web to solve a document-sharing problem at CERN. (Yep, the American military invented the internet but European scientists made it useful to the world; it would appear that this new renaissance began in Europe as well.)
  2. Many people have already claimed that the internet (er, the web) will eventually enable more people to read (on a per-capita basis) than the efforts of Johannes Gutenberg did. I think this is true, but we're not there just yet.
  3. The downside of the internet is that it has facilitated communication between crazy people and their conspiracy theories. It has also facilitated a large crop of radio-like internet broadcasters who further divide the population into political tribes. I believe this is why we witnessed "couldn't happen" events like BREXIT, or the election of a real estate tycoon as President of the United States. On the flip side, society might not be as divided as we think: while the political left dukes it out with the political right, roughly 50% of the people do not vote, which means the political left and right only represent about 25% each.
  4. This fight between left and right seems very reminiscent of the fight between Catholic and Protestant more than four centuries ago. While some people could not imagine a non-Catholic Europe, it was the children of the people who started the Thirty Years' War who put a stop to the madness by reminding everyone that Christianity was supposed to be a religion of non-violence, not war or inquisitions resulting in new ways to torture people to death. While 2019 marks 30 years since Tim Berners-Lee invented the web, we don't need to engage in a thirty years' war of our own to put an end to our social madness. Just as Gutenberg introduced technology which enabled the advance of science while (eventually) reducing the importance of religious dogma, I wonder if the world-wide-web will make the dogma of political extremes a thing of the past.

Epiphany 14: Something in the water of Nordic-Scandinavian countries?

Format: year: who (country of birth) created what, and where; notes
1985: Bjarne Stroustrup (Denmark) created C++ in New Jersey; he was looking to add object support to the C language
1987: Andrew S. Tanenbaum (USA) created MINIX in the Netherlands; he was looking for an alternative to UNIX®
1990: Guido van Rossum (Netherlands) created Python in the Netherlands; he was looking for a successor to the ABC language
1991: Linus Torvalds (Finland) created Linux in Finland; he was working on a smaller, more-efficient kernel
1995: David Axmark (Sweden) and Michael Widenius (Finland) created MySQL in Sweden; they were looking for an inexpensive SQL engine for PCs
1995: Rasmus Lerdorf (Denmark) created PHP in Waterloo, Canada; he was looking for a better CGI tool
2009: Michael Widenius (Finland) created MariaDB in Sweden; he was looking for an alternative to SUN-supported MySQL

Epiphany 15: Big Data renders first-past-the-post elections irrelevant

This next section requires the reader to have some familiarity with a few terms:
  1. Demographics and Demographic Analysis
  2. Psychographics
  3. Big Data
Facebook and 'Cambridge Analytica' - a summary of what happened
Wikipedia: American presidential election of 2016
Wikipedia: BREXIT vote of 2016
 
  1. Facebook sold access to their customer data. One of their business partners was Global Science Research
    • Social media outlets have always done this. It is how you get to use their platform free-of-charge.
    • This was not illegal -and- all Facebook users involved "clicked through" acceptance agreements which most people never read
  2. In 2014, Global Science Research co-director Aleksandr Kogan created a Facebook app / questionnaire called This Is Your Digital Life
    • in all versions of this story, this app was also used to collect your Facebook "Likes and Dislikes"
    • in many versions of the story, this app also collected the "Likes and Dislikes" of your FB friends
      comment: since those secondary-level people were never presented with an acceptance agreement, this seems unethical if not illegal
  3. Aleksandr Kogan's company, Global Science Research, sold his questionnaire data to Cambridge Analytica, a data mining firm:
    • who sold information to AggregateIQ which was hired by the Leave Campaign prior to the British BREXIT referendum
    • who is now known to have worked for the 2016 political campaigns of Ted Cruz and Donald Trump
  4. The results of the voluntary questionnaire were run through a psychographic analysis which resulted in the Facebook participants being slotted into 27 categories
    • It is now believed that Cambridge Analytica collected data on 87 million Facebook users
  5. Some of the categories identified Facebook people who...
    • never vote
    • always vote Republican (or conservative)
    • always vote Democratic (or liberal)
    • are political centrists -and- who might be convinced to vote one way or the other
  6. This last group was targeted with advertising (which may have masqueraded as fake news) with the intent of convincing some people to vote in the desired direction (pro-Trump or pro-BREXIT) or to just stay home. This works well in any first-past-the-post (winner-take-all) election or plebiscite. (A rough sketch of the category-slotting step appears just after this list.)
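
For readers who wonder what "slotting people into categories" looks like in practice, here is a minimal, purely hypothetical Python sketch using scikit-learn's k-means clustering on made-up questionnaire scores. This is not Cambridge Analytica's actual method (which was never published in detail); the five survey traits are invented for this illustration, and only the 27-category count comes from the story above.

  # Hypothetical sketch: cluster made-up questionnaire responses into 27 groups.
  # This is NOT the actual Cambridge Analytica pipeline; it only illustrates
  # the general idea of psychographic "slotting".
  import numpy as np
  from sklearn.cluster import KMeans

  rng = np.random.default_rng(seed=42)
  # 10,000 fake respondents, each scored 0..1 on five invented personality traits
  responses = rng.random((10_000, 5))

  kmeans = KMeans(n_clusters=27, n_init=10, random_state=0)
  category = kmeans.fit_predict(responses)   # one of 27 labels per respondent

  # A campaign would then target only the "persuadable" categories.
  print(np.bincount(category))               # how many people landed in each slot
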
Comments - Observations
  1. Since all first-past-the-post elections can now be manipulated by big data technology combined with psychographics, democracies need to shift to proportional representation.
  2. Mark Twain is often credited with saying, “It’s easier to fool people than to convince them they have been fooled.”
  3. I am very worried about America's continual blaming of Russia and/or Putin. As long as Americans do this, they will be blind to the effect of social media in the next election.

Supporting Material

Epiphany 16: Industrial/Technical Evolution (part-1)

When you talk to anyone about the Industrial Revolution, most people think of only one big change between 1760 and 1860. But this is a gross oversimplification if you consider that it started with the age of steam, then transitioned to the age of electricity, and then to the age of information. When we talk about technology in the information age, should we begin with computers, or should we first start with getting information to people (e.g. telegraph, telephone, radio, television, cable television, internet)? You might even wish to begin with scrolls, newspapers, and books.

The Evolution of Locomotives

steam (wood + coal) >> petroleum >> electricity

Thinking about locomotives for a moment: they began by burning wood to produce steam, which was used to turn wheels. Europe quickly experienced wood shortages, so locomotives switched over to coal (or any fossil fuel) with little difficulty. Humans have burned petroleum for over 5,000 years, but it wasn't until the mid-1800s that the industrial revolution commercialized petroleum extraction and refinement.

Steam locomotives eventually morphed into diesel locomotives where the fuel is burned in combustion engines to directly operate pistons (i.e. no intermediate steam is required). But the immense power was difficult to control via a mechanical transmission, so diesel locomotives morphed into diesel-electric systems where "a diesel engine runs an electrical generator" which is then used to power electric motors.

At this point you can see that, if an external electricity source is available, a locomotive can be made much more efficient (and lighter) by doing away with the diesel engine and electrical generator.

The Evolution of Computers

Hardware

Software Systems

Observations:

  1. Improvements in either category (hardware or software) were not linear but exponential. Improvements in hardware and software together are multiplicative (think: exponential growth on steroids)
  2. Back in the late 1970s, a 16-bit minicomputer like the PDP-11/44 employed an 8085 microprocessor to run the processor's console switches, LEDs, and serial interface. But some people in the computer industry could never accept that microprocessors like the 8085 (or its descendants) would eventually make the minicomputer processor obsolete
  3. Today I run into all kinds of people who just refuse to accept the fact that CLOUD-based systems are taking over from their favorite computer hardware and OS

Epiphany 17: What is old is new again (Python vs BASIC)

Epiphany 18: Industrial/Technical Evolutions (part-2)

Food-for-thought:
  • While there are no official dates, it appears to me that industrial revolutions happen almost every hundred years, starting between 1760 and 1765 (on average)
  • It goes without saying that each revolution merges with previous revolutions (heavier-than-air flight first occurred in 1903, which is firmly in the middle of the second industrial age)
  • Will the third industrial age end earlier as we now enter the age of biology, or is biology just another form of information technology which is much easier to research with computers, robotic sequencers, and the internet?
  • Companies like Google and Amazon Web Services are now more valuable than their fossil-fuel equivalents, which is proof that information is more valuable than energy

The First Industrial Age (Age of Steam) years: 1760-1860

The Second Industrial Age (Age of Electricity) years: 1860-1960

The Third Industrial Age (Age of Digital Data / Age of Information) years: 1960-2010?

comment: so when politicians claim that they "will make America great again" or that "BREXIT can bring back lost jobs", those politicians are either lying or ignorant of the facts just presented. Truth be told:

Speculation: The Fourth Industrial Age (Artificial Intelligence, deep learning, and more?) years: 2010-now

Epiphany 19: The dominance of C/C++ (part 3/3)

In the early days of computing, most hardware manufacturers also created, then sold, software. Putting aside operating systems for a moment, those vendors also sold programming tools, including compilers for programming languages like Fortran (1956), ALGOL (1958) and COBOL (1959). One of the most important changes to the computing industry occurred when computer professionals got together to standardize computer languages.

One measure of any computer language's success is, I think, the number of standardizations. For example, although the BASIC language was once very popular (I am still forced to use VMS-BASIC every day by my employer in 2019), it has gone through a very small number of standardizations. This might have something to do with the fact that many BASIC implementations were so different that standardization was not possible or, perhaps, desirable. On the other hand, languages like C, C++ and Objective-C have gone through numerous standards and continue to be improved.

For example, non-standard "C" first appeared in 1972 and is now referred to as K&R C after its authors, Brian Kernighan and Dennis Ritchie. Improvements were formalized then published as C89 in 1989 by ANSI and as C90 by ISO. This continued with the names C99, C11 and C18 as described here.

comment: It appears that "C" moves to a new standardization level approximately every 10 years (on average) whilst C++ moves to a new level approximately every 3 years (on average)

Linux

Since 50% of all the digital devices on the planet "running an operating system" use some form of Linux, it should be no surprise that it is the Linux community that is pushing newer versions of C and C++.

Since gcc is used to build Linux, we should look at this toolset a little more closely.

Oddities

Make(s)

One thing usually overlooked in this ecosystem begins with the Unix make utility, first written in 1977 by Stuart Feldman of Bell Labs. This was the easiest method of getting an application running on another Unix implementation, which might be running on a completely foreign computer architecture. I'm jumping over a lot of steps here but, after a time, SUN Microsystems thought that make was getting a little long in the tooth so they developed a product called "Solaris Package Manager". It should surprise no one that Red Hat needed something similar in 1995 to move to Red Hat Linux 2, so they further developed the SUN product, calling it the "Red Hat Package Manager" (file extension: '.rpm')

Epiphany 20: A few thoughts on the 50th anniversary of Apollo 11 landing on the moon

The manned spacecraft programs in both Russia and the United States changed the world in more ways than most people will ever know. First off, humanity cannot ignore the contribution of the Soviet space program because the United States would never have attempted such a thing if it were not for cold-war politics. The plain fact is this: many Americans didn't care about Apollo but did care about beating the Russians at something. History informs us that President Richard Nixon and Henry Kissinger were plotting to terminate the Apollo program at the very time Nixon was congratulating, via telephone from the White House, Apollo 11 astronaut Neil Armstrong who was standing on the moon (both Nixon and Kissinger thought that Vietnam was a more urgent issue; more on this in a moment). To add insult to injury, many Americans lost interest after the apparently routine flight of Apollo 12 (e.g. we won that race twice; time to move on). Proof of this can be seen in the number of citizens who complained to the TV networks about favorite TV programs being preempted by spaceflight coverage. Heck, in-flight video transmissions from Apollo 13 were not aired until after a tank explosion placed the mission in jeopardy. This public disinterest contributed to the cancellation of Apollo flights 18-20.

FACT: development of the Apollo Guidance Computer (AGC) was a trigger event for an enormous amount of human technological progress. The design work was done by Draper Labs at MIT while manufacturing was done by Raytheon. Why was the AGC necessary? Initially, many astronauts and cosmonauts incorrectly thought that human pilots would be able to fly spacecraft directly, much in the same way that pilots flew aircraft. Consider this thought experiment: you are flying the Lunar Module and need to catch up to, then dock with, the Command Module. For simplicity, assume that both vehicles have identical orbits and velocities but are separated by a distance of 1000 m (3,280 ft). Without thinking about life outside of the atmosphere, you fire your RCS thrusters (force: 100 pounds or 444 newtons) while aiming at the Command Module. This increases your velocity, which pushes you into a higher orbit. Your new orbital velocity is faster but your orbital period is now longer. This causes the Command Module to quickly pass under you, making it impossible to dock. One correct solution dictates that you should fire your RCS thrusters away from the target vehicle, which will cause you to drop into a slightly lower orbit; then wait a short period of time; then fire your RCS thrusters in the opposite direction, which should return you to the same orbit as the CM but, hopefully, much closer (BTW, first firing forward then quickly firing backward produces the same result). Remember "F = ma" from Isaac Newton's second law? Since "a = dV/dT", the second law can be rewritten as "F = m x dV/dT", which becomes "F x dT = m x dV" (the left side of this equation is known as impulse). The instantaneous mass of the LM (which decreases every time you fire your thrusters) determines how long you should fire them in every maneuver (e.g. two one-second thrusts will not produce identical results; a one-second forward burn cannot be exactly cancelled by a one-second reverse burn). These real-time calculus solutions are best determined by a guidance computer because fuel is limited and must be conserved.
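
Here is a small Python sketch of that last point: two burns of identical length do not produce identical delta-V because the ship gets lighter as propellant is expelled. The 444 N thrust comes from the text above; the spacecraft mass (15,000 kg) and the thruster's specific impulse (290 s) are round numbers I have assumed purely for illustration.

  # Integrate F = m * dV/dT over a burn while letting the mass decrease.
  # Thrust (444 N) is from the text; mass and specific impulse are assumptions.
  F    = 444.0           # thrust in newtons (~100 lbf)
  ISP  = 290.0           # assumed specific impulse, seconds
  G0   = 9.81            # standard gravity, m/s^2
  MDOT = F / (ISP * G0)  # propellant mass flow rate while thrusting, kg/s

  def burn(mass_kg, seconds, dt=0.01):
      """Return (delta_v, remaining_mass) for a burn of the given length."""
      dv, t = 0.0, 0.0
      while t < seconds:
          dv += (F / mass_kg) * dt   # dV = (F/m) * dT
          mass_kg -= MDOT * dt       # ship gets lighter as propellant burns
          t += dt
      return dv, mass_kg

  mass = 15_000.0                    # assumed spacecraft mass, kg
  dv1, mass = burn(mass, 10.0)       # first 10-second burn
  dv2, mass = burn(mass, 10.0)       # second 10-second burn (lighter ship)
  print(dv1, dv2)                    # dv2 is slightly larger than dv1

The difference is tiny for a heavy ship, but it is exactly this kind of continuous bookkeeping (mass, thrust, timing) that the AGC performed so the crew didn't have to.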

How did the development of the AGC improve things here on Earth? First off, commercial mainframe computers in the 1960s were manufactured from discrete electronic components, including individual transistors and diodes. So when IBM learned that the AGC had to fit into a volume the size of a bread box (one cubic foot, or 28,316 cc), many IBM engineers didn't think it was possible. The Draper/Raytheon solution employed "integrated circuits" (semiconductor chips containing numerous transistors), which they were already using in a more primitive way inside Polaris missile guidance systems. The high per-component prices meant that the American government was their primary customer (Apollo consumed 60% of the ICs produced in America in 1966). Because of the high cost, government contractors developed semiconductor test methods to ensure that the government would only pay for components that met design specifications. These testing methods eventually migrated from the customer (the government) back to the manufacturing industry, which resulted in affordable chips for the rest of us. That revolution in chip manufacturing produced things like:

comment: Apollo used 3-input NOR gates manufactured by Fairchild Semiconductor. Fairchild engineers later left to form Intel and, later still, AMD

Software jobs also changed drastically during this time. While it is true that high-level programming languages like FORTRAN (1957) and COBOL (1959) existed, the phrase "computer programmer" did not yet exist and computer programming was primarily done by mathematicians. High-level languages required more memory and CPU power than what was available on the AGC, but they were employed on mainframe computers used "to run AGC flight simulations" and then "generate the necessary binary code" for the hand-made read-only core-rope memory used to hold AGC programs. The level of redundancy built into the AGC programs (see reference-1) should make it clear that these people were doing "computer engineering". Click Margaret Hamilton to see what I mean.

Critics of Apollo mention that the program was too expensive in that it consumed too much of the national budget, with 4% being the oft-quoted number. Let me remind everyone that cold-war concerns at the time mandated that the defense budget be kept secret. On top of that, no American citizen knew how much money was being used to support the Vietnam War. Today we know that the total cost of Apollo (in 1968 dollars) was $25 billion whilst the cost of the Vietnam War (also in 1968 dollars) was $168 billion. Now, everyone knows that it is way harder to create something than it is to destroy something, so allow me to state the obvious: America got no return on the $168 billion investment. Check out the next chart then advise your political representatives accordingly:

(each entry below lists: Activity / Cost / Human Cost / ROI (return on investment) / Notes)

Apollo Manned Spacecraft Program
  Cost: $25 billion (1968 dollars)
  Human cost: 3 astronauts
  ROI: advances in metallurgy; advances in semiconductor technology; advances in computer engineering; advances in software engineering; invention of the internet; admiration of the whole world
  Notes: During the peak years, the Apollo program employed approximately 400,000 scientists, engineers and technicians across 20,000 companies. Much of this work was done by, or managed by, defense contractors.
  Many Americans continue to bang on about the USA being a Christian nation but I wonder if they will ever turn their spears into pruning hooks as is mentioned in Isaiah 2:3–4.

Vietnam War
  Cost: $168 billion (1968 dollars)
  Human cost: 58,000 US soldiers killed; 200,000 US soldiers injured; 2 million Vietnamese civilians; 1.1 million North Vietnamese fighters; 200,000 South Vietnamese soldiers; 50,000 civilians in Laos (up to 1973); 259,000 civilians in Cambodia; Agent Orange
  ROI: contempt of the whole world
  Notes: During some peak years, more than 400,000 American soldiers were committed to Vietnam (almost the same number of people tied to the manned spaceflight effort).
  Despite hearing crazy political justifications like "the domino theory", America lost this war but no literal or metaphorical dominoes were ever observed.

First Gulf War
  Cost: $61 billion
  Human cost: 382 US military (other casualties not listed)
  ROI: American defense contractors do well
  Notes: first use of "depleted uranium" by the Americans

Middle-East Wars
  Cost: $5.9 trillion
  Human cost: ???
  ROI: American defense contractors do well
  Notes: Hopefully everyone reading this knows that "1 trillion" = "1,000 billion". First use of: Extraordinary rendition
war references:
  1. https://fas.org/sgp/crs/natsec/RS22926.pdf (published by the Congressional Research Service)
  2. https://en.wikipedia.org/wiki/Vietnam_War_casualties
  3. https://www.irishtimes.com/news/world/asia-pacific/death-from-below-in-the-world-s-most-bombed-country-1.3078351
  4. https://www.cnn.com/2013/09/15/world/meast/gulf-war-fast-facts/index.html
  5. https://www.cnbc.com/2018/11/14/us-has-spent-5point9-trillion-on-middle-east-asia-wars-since-2001-study.html
  6. comment: Since World War Two ended in 1945, the United States of America has lost every war. But no one seems to care as long as the economy is humming along.

What About China?

Back in the 1960s, conservatism revolved around business interests, which caused some Americans to wonder if China was a missed business opportunity. This was the main reason for Nixon and Kissinger opening relations with China in 1972. IMHO this trigger event started the Chinese shift from agriculture to industrialization (think of this as an industrial revolution confined to one country).

American companies and schools in the 1980s found themselves in the Reagan era of "minimal government", which now goes by the name austerity. This might have translated into greater economic problems, including unemployment, were it not for the actions of China's leader, Deng Xiaoping, who favored "maximal government" and was paying to send thousands of Chinese students to the USA every year to be educated.

I personally experienced this in Boston in 1985: we had morning lectures and afternoon labs. An English-speaking Chinese student sat one row ahead of me in the lecture hall, accompanied by two minders who could not speak English but were required to pay for student slots (the minders were there to ensure the student would return to China; they passed the day in class by reading little brown books of political dogma). Back then, Americans correctly welcomed these foreign students (it was a business opportunity) but no one ever thought that China would eventually compete head-to-head with the USA. I applaud the Chinese students who were able to acquire an education in a foreign country, speaking a foreign language, but I wonder how many Americans would be willing, or able, to do the same by travelling to China.
Comments:

Steps to MAGA (Make America Great Again)

  1. ELIMINATE OFFENSE SPENDING
    • The war-cost chart above (Epiphany 20) provides proof that many millions of people would be alive today if the USA had not been funding foreign wars on foreign shores. One way to make America great again is to stop funneling taxpayer money into those defense programs which, in reality, are offense programs used to support the American empire
  2. FUND SCIENCE and TECHNOLOGY
    • Like an addict who is unable to quit cold turkey, defense contractors will probably not be able to survive having their government funds reduced. But just as happened during the 1950s and 1960s, defense contractors could be employed by the government to do big science projects through civilian agencies like NASA
    • During the 1990s, the American defense budget was always under $300 billion per year. By 2010 the defense budget had climbed to $721 billion per year. Meanwhile, NASA's budget has fluctuated between $17 and $20 billion since 2006. If NASA's budget were doubled by diverting money from the defense budget, would the Pentagon even notice? (see the quick calculation after this list) And yet we know that spending on NASA will have a positive ROI (return on investment)
    • DARPA is one American defense investment with a very high ROI
  3. FREE EDUCATION
    • There are still many people alive today who will tell you that they attended Harvard University in the 1950s and only paid $100.00 tuition
    • Before defense contractors started receiving the bulk of government monies, the American government used to subsidize all college and university educations. Funding for education slowly dried up as money was diverted into the defense/offense budgets.
    • Once money is diverted from offense back into education, the whole economy will repair itself
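
As a quick sanity check of the budget claim in point 2, using only the figures quoted above (a ~$721 billion defense budget versus a ~$20 billion NASA budget), here is a one-line Python calculation:

  # Would the Pentagon notice if NASA's budget were doubled at its expense?
  defense_budget = 721e9   # 2010 figure quoted above
  nasa_budget    = 20e9    # upper end of the range quoted above
  print(f"Doubling NASA would cost about {nasa_budget / defense_budget:.1%} of the defense budget")  # ~2.8%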

Epiphany 21: Microsoft is promoting Python3 (Python version 3.x)

2020-05-06: Today’s alert from ZDNet informs me that Microsoft has added 51 Python videos to their previous 44. I read somewhere that, because Bill Gates had started out programming in BASIC, Microsoft could never really get rid of it (other than the massive changes to Visual Studio between VS6 and VS.NET, where they made it play nice with C/C++ and anything else that generated code under the .NET framework). I wonder what Bill would say about the shift to Python?

“Python for beginners” playlist from Microsoft (this is the original 44)
https://www.youtube.com/playlist?list=PLlrxD0HtieHhS8VzuMCfQD4uJ9yne1mE6
notes: “More Python for beginners” (this is the new 20)
https://www.youtube.com/playlist?list=PLlrxD0HtieHiXd-nEby-TMCoUNwhbLUnj 

“Even More Python for Beginners” (this is the new 31)
https://www.youtube.com/playlist?list=PLlrxD0HtieHhHnCUVtR8UHS7eLl33zfJ-

Epiphany 22: Machine Learning is a real thing


NumPy (Community, 2006) - library that adds array support to Python; also contains routines for doing matrix mathematics - https://en.wikipedia.org/wiki/NumPy
scikit-learn (Google Summer of Code, 2007) - first-generation learning library - https://en.wikipedia.org/wiki/Scikit-learn
TensorFlow (Google, 2015) - second-generation learning library - https://en.wikipedia.org/wiki/TensorFlow
Keras (Google, 2015) - second-generation learning library - https://en.wikipedia.org/wiki/Keras
PyTorch (Facebook, 2017) - second-generation learning library - https://en.wikipedia.org/wiki/PyTorch
caveats: both machine learning and artificial intelligence have a lot of history going back to the 1940s and 1950s, so the field contains a lot of non-computer terminology. Modern computer technologists wishing to learn more might wish to start here:
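
As a concrete starting point, here is a minimal example using two of the libraries from the table above (scikit-learn, which builds on NumPy). It trains a small classifier on scikit-learn's bundled iris data set; the choice of data set and model is mine, purely for illustration.

  # Minimal machine-learning example using scikit-learn (which builds on NumPy).
  from sklearn.datasets import load_iris
  from sklearn.model_selection import train_test_split
  from sklearn.linear_model import LogisticRegression

  X, y = load_iris(return_X_y=True)             # 150 flower measurements, 3 species
  X_train, X_test, y_train, y_test = train_test_split(
      X, y, test_size=0.25, random_state=0)

  model = LogisticRegression(max_iter=1000)     # plenty of iterations to converge
  model.fit(X_train, y_train)                   # "learn" from the training split

  print(f"accuracy on unseen data: {model.score(X_test, y_test):.2%}")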

External Links


Neil Rieck
Waterloo, Ontario, Canada.