AI and Quantum Technologies: Bridging Misunderstandings

A communication technology influenced by early twentieth-century physics

In recent years, the media has focused relentlessly on the capabilities and potential dangers of artificial intelligence (AI). This attention has coincided with a rise in public figures sharing their perspectives, ranging from optimism to dire warnings about humanity's future (just take a look at YouTube). Concurrently, interest in quantum technologies has grown markedly. Contemporary AI has come to be seen as the endpoint of a gradual technological evolution rooted in symbolic logic, a long-standing topic in theoretical computer science that has also served as a thought experiment in cognitive science and the philosophy of mind.

Advances in processing power and storage, exemplified by cloud computing, alongside progress in computational linguistics and natural language processing, have transformed the algorithms behind large language models. These models, trained on data accumulated over years of social media use, combined with the capacity of neural networks for multi-dimensional data processing and basic decision-making informed by prior experience, suggest that ideas once confined to thought experiments may now have the potential to turn our internet-connected world into a vast experimental laboratory.

These previously distinct fields of research have converged into AI through the mechanism of machine learning. For those seeking to separate reality from hype surrounding AI, I recommend exploring industry perspectives here and academic viewpoints here. Current policy discussions around AI and quantum technologies, often presented as separate entities, reveal intertwined technical heritages that warrant closer examination.

To begin with a historical lens: the foundational ideas of AI emerged around the same time as those that led to quantum technologies. Theoretical and philosophical inquiry into the reach of the physical sciences, shaped by successive scientific revolutions, paved the way for a mechanical revolution in automating labor, answering the first Industrial Revolution's demands for mechanical capability and computational efficiency.

The late 19th and early 20th centuries witnessed major advances in electricity, magnetism, heat, and optics, bolstered by progress in abstract and geometrical algebra, culminating in the theories of relativity and quantum mechanics. Some modern physicists assert that the quantum information framework offers the most accurate interpretation, suggesting that our physical reality may consist largely of quantum phenomena, much as the largely invisible dark matter and dark energy constitute most of our universe.

Stepping back to the interwar period, the pioneers of quantum mechanics (roughly two and a half generations of scientists) were engaged not only in abstract debates over axioms and interpretations but also in asking how their findings could apply to fields as diverse as biology, psychology, and computing.

Notably, Erwin Schrödinger explored the implications of his work in physics for biology, particularly genetics, while John von Neumann viewed biophysics and neuroscience as potential paths toward more advanced computational systems. The contributions of these individuals and their contemporaries played a significant role in ushering in the Atomic Age and the technologies of nuclear fission and thermonuclear fusion.

Less commonly discussed are Schrödinger's and von Neumann's interests in consciousness and intelligence, as well as in operational research bridging the biological and computational domains. As World War II approached, the second Industrial Revolution was well underway and the third was on the horizon. The need to operationalize even the most abstract scientific concepts for wartime purposes and national prestige became urgent, and cybernetics emerged as a promising avenue. It is in this realm that quantum physics and AI began to converge.

In a book published in 1989, The Emperor's New Mind (revised edition 2016), mathematician and physicist Roger Penrose posited that quantum mechanics could be crucial to understanding human consciousness. He argued that our grasp of AI is limited by our habit of tying its development to classical computing paradigms. If AI's premise is to mimic human cognition and awareness, should we reconsider the emphasis placed on algorithmic learning and big-data management as true representations of AI, or should we redefine the term to match the contemporary advances driven by algorithmic training methods?

Penrose's critique implies that a genuine AI system may need to be non-deterministic and probabilistic, reflecting the nature of quantum mechanics. This raises the question of whether the excitement surrounding quantum technologies and AI has led us to overlook the distinction between their conceptual and technological realities. I will refrain from judging the adequacy of current approaches to responsible AI and quantum technologies, or whether treating them as separate entities is justified given their divergent social histories. Instead, I will focus on a shared constraint that has shaped their technological evolution, dividing my analysis into two sections: the first concerns theoretical frameworks versus practical implementation; the second, problem-solving approaches.

On theoretical frameworks versus practical implementation, consider the challenge of translating complex mental constructs and theoretical models into scalable, actionable ideas, a process requiring numerous iterations. As early as 1929, philosopher Hans Reichenbach noted that technology tends to treat science merely as a resource, with scientific inquiry driven by instrumental needs to address real-world challenges. Faced with the uncertainty of quantum physics, technologists tend to prioritize tangible measurement, seeking quantities that can be pinned down even within a probabilistic framework.

Given that a fully functional quantum computer remains a future prospect, current systems can only emulate quantum computing, much as we simulate older operating systems to play vintage games on modern machines. As I will discuss in upcoming articles, certain quantum principles are already integrated into devices that operate solely on classical physics. Even when quantum computers become commonplace, classical systems will likely persist, albeit reconfigured to exploit previously dormant capabilities in circuit design, processing technology, and data management. And if we liken a quantum computer to a jet engine, we might question the necessity of deploying such machinery for simple tasks, like a quick trip to the grocery store.
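To make the idea of emulation concrete, here is a minimal sketch, in plain Python with NumPy, of what a classical machine actually does when it simulates a qubit: the quantum state becomes an ordinary array of complex amplitudes, a gate becomes a matrix multiplication, and measurement becomes classical random sampling. The variable names are my own; this is an illustration, not any particular simulator's API.

```python
import numpy as np

# A single qubit's state is a vector of two complex amplitudes; |0> = (1, 0).
state = np.array([1.0 + 0j, 0.0 + 0j])

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
state = H @ state

# "Measurement" on classical hardware is just sampling from the distribution
# given by the squared magnitudes of the amplitudes (the Born rule).
probabilities = np.abs(state) ** 2
samples = np.random.choice([0, 1], size=1000, p=probabilities)
print(f"P(0) ~ {np.mean(samples == 0):.3f}, P(1) ~ {np.mean(samples == 1):.3f}")
```

The catch is scale: an n-qubit state requires 2^n complex amplitudes, so this style of emulation becomes intractable beyond a few dozen qubits, which is precisely why genuine quantum hardware is pursued at all.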

The second consideration involves identifying the intended users and objectives of quantum computing technologies. At present, a gap persists between fundamental research in quantum computing and its practical application in industry. However, the significance of quantum physics in the life sciences is gaining recognition, with an emerging community of interdisciplinary researchers pursuing quantum biology. The field aims to work around the limits of studying biological systems at scale by venturing into less charted territory, at a level finer than molecular structure.

Technologies in the life sciences that demand greater precision and refinement stand to benefit from quantum physics, particularly in areas like drug delivery and chemical transport. My own research into nuclear technology has examined irradiation technology for mutagenesis in agriculture in developing economies, showing such applications to be both practical and not overly reliant on high-tech solutions. Irradiation in this setting, however, resembles a black box: the outcome is reached through trial and error. In their book Life on the Edge, biologist Johnjoe McFadden and physicist Jim Al-Khalili suggest that a deeper understanding of genetic mutation requires knowledge at the quantum scale. Additionally, a 2021 report from McKinsey indicated that quantum computing could revolutionize the automotive and financial sectors through optimization at both the algorithmic and computational levels, exemplified by problems such as the traveling salesman problem, a classic of combinatorial optimization long studied in AI and operations research. The parallels drawn between quantum technologies and AI illustrate that our current perception of AI, though it appears deterministic, has been shaped by its instrumentalization as a problem-solver rather than by the foundational inquiries that originally motivated its development.
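To see why this class of optimization problem invites both AI heuristics and quantum proposals, consider the naive exact approach to the traveling salesman problem sketched below. The city coordinates are made up for illustration; the point is the factorial growth in the number of tours, which is what makes smarter classical heuristics, and possibly quantum methods, attractive.

```python
import itertools
import math

# Hypothetical city coordinates, chosen only for illustration.
cities = {"A": (0, 0), "B": (1, 5), "C": (4, 2), "D": (6, 6)}

def tour_length(order):
    """Total length of a closed tour visiting the cities in the given order."""
    total = 0.0
    for here, there in zip(order, order[1:] + order[:1]):
        (x1, y1), (x2, y2) = cities[here], cities[there]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

# Brute force: evaluate all (n - 1)! distinct tours from a fixed start city.
# Exact, but the count explodes factorially: 20 cities already means
# 19! (about 1.2e17) tours, far beyond exhaustive search.
start = "A"
rest = [c for c in cities if c != start]
best = min(
    ([start] + list(p) for p in itertools.permutations(rest)),
    key=tour_length,
)
print(best, round(tour_length(best), 2))
```

In practice, classical AI approaches trade exactness for tractability (nearest-neighbor construction, simulated annealing, genetic algorithms), and quantum optimization proposals target the same combinatorial explosion from a different physical angle.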

The distinction drawn between AI and quantum technologies in policymaking stems from their being framed in technological rather than ontological terms, which results in their classification as separate entities. While this approach is not inherently flawed, it raises the question of whether current policies adequately address the divide between these technologies' theoretical foundations and the problem-solving frameworks that have shaped their evolution. The attention given to the ethical, legal, and policy implications of both technologies has largely been driven by a value-belief-norm (VBN) framework shaped by a particular form of instrumental rationality, one that takes a linear view of how anticipated outcomes dictate the processes for achieving them.

Much remains unexplored in our current technological landscape, and approaching these topics through narrative design and branching stories could reveal remarkable insights. Throughout this exploration, I will discuss various strategic directions, from policy and industry perspectives alike, while returning continually to themes of justice, ethics, and shared responsibility.
