ChatGPT – the New Personal Computer? A Look Back at Historical Patterns


Companies today are using generative pre-trained transformers (GPTs) to build models that let users interact in a conversational manner. These models are machine learning architectures trained to perform tasks that would otherwise require significant human effort. Many people are concerned about the potential impact of these emerging technologies on their personal and professional lives. Although the initial cost of developing these tools is substantial, entry costs should fall significantly as new tools and investment arrive. AI technologies will continue to evolve, providing efficiencies across many aspects of life.

AI: The New Computer Chip? Cost of Entry over Time


The threat of artificial intelligence replacing a significant number of jobs is real. Estimates from the Massachusetts Institute of Technology suggest that about 20% of jobs could be made more efficient or eliminated by conversational AI. Those estimates, however, reflect the current environment, in which high startup costs significantly hinder scalability. As investment in machine learning and conversational AI continues and new discoveries accumulate, the cost of entry into the AI space will fall, and more jobs could be affected.
A lesson from Moore’s Law (coined by Gordon Moore, co-founder of Intel) is particularly relevant here. In 1965, Moore observed that the number of transistors on an integrated circuit had roughly doubled every year since 1959, a pace he later revised to a doubling about every two years. Could AI be on a similar trajectory? Below is historical data on memory prices from 1957 to 1970:

Key Data Points:

From 1957 to 1970, the cost of memory technology saw a dramatic reduction, highlighting a fundamental aspect of technological evolution: as technology matures, the cost of its components tends to decrease significantly.

  • 1957: Memory cost approximately $411,041,792 per megabyte, using vacuum tube flip-flop technology.
  • 1959: Just two years later, the cost decreased dramatically to $67,947,725 per megabyte with the move to transistor flip-flop technology.
  • 1960: The advent of IBM 1401 core memory further slashed prices to $5,242,880 per megabyte.
  • 1965: Costs fell to $2,642,412 per megabyte with IBM 360/30 core memory.
  • 1970: By this year, costs had plummeted to $734,003 per megabyte with IBM 370/135 core memory.
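
To put the drop in perspective, a quick back-of-the-envelope calculation (a sketch in Python, using only the per-megabyte figures quoted above) gives the average annual rate of decline:

```python
# Cost per megabyte by year, taken from the data points above.
costs = {
    1957: 411_041_792,
    1959: 67_947_725,
    1960: 5_242_880,
    1965: 2_642_412,
    1970: 734_003,
}

years = sorted(costs)
start, end = years[0], years[-1]

# Compound annual rate of decline between the first and last year.
ratio = costs[end] / costs[start]
annual_decline = 1 - ratio ** (1 / (end - start))
print(f"{start}-{end}: cost fell {annual_decline:.0%} per year on average")
```

By this measure, memory cost fell by roughly 39% per year on average over the period, a halving time of well under two years.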

Implications for Conversational AI Technologies:

Similar to the advancements in memory technology, conversational AI technologies are likely on a parallel cost reduction trajectory. Initially, conversational AI systems are expensive due to research, development, and implementation costs. However, as the technology advances, the costs associated with computing power, data storage, and AI models are expected to decrease. This will make conversational AI more accessible and feasible across various applications, potentially mirroring the cost reduction seen in memory technology.
The historical reduction in memory costs provides a hopeful analogy for the future of conversational AI. As the technology evolves, we can anticipate similar cost reductions, which will expand its applications and accessibility. The lesson from the past is clear: technological advances bring not only enhanced capabilities but also lower costs, eventually making what was once a luxury commonplace. This site sets out to investigate which GPT tools bring the most value.

My goal is to foster a user-led community focused on leveraging different GPTs to enhance all aspects of life, allowing us to concentrate on areas that AI cannot replace. If you are looking to bootstrap your small business and secure your long-term career, you are in the right place.
