ARTHUR C. CLARKE - THE THREE LAWS
From "Hazards of Prophecy: The Failure of Imagination"
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
Any sufficiently advanced technology is indistinguishable from magic.
"As three laws were good enough for Newton, I have modestly decided to stop there". Arthur C. Clarke
Isaac Asimov's Corollary to Clarke's First Law: "When, however, the lay public rallies round an idea that is denounced by distinguished but elderly scientists and supports that idea with great fervor and emotion – the distinguished but elderly scientists are then, after all, probably right."
GENERATIVE PRE-TRAINED TRANSFORMER
The Generative Pre-Trained Transformer (GPT) system is able to transform one thing into another without losing the MEANING, just like you can!
Look at the following sentence:
She poured water from the Pitcher into the Cup until IT was Full
What is IT referring to in the sentence?
Look at this sentence:
She poured water from the Pitcher into the Cup until IT was Empty
What is IT referring to in the sentence?
Even though the second sentence uses IT with a different meaning, you do not get lost.
Through your Intelligence, you are able to Transform the meaning correctly.
The ChatGPT system, through its Artificial Intelligence, is able to do exactly the same transformation. It can Transform the sentence in a variety of different ways and generate a variety of different outputs without losing the meaning. This is the first time in human history that a machine has been able to perform this trick.
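To make the idea concrete, here is a minimal sketch (not taken from this page; the gpt2 model and the Hugging Face transformers library are illustrative assumptions) of how a small causal language model can be asked to resolve the ambiguous IT: each candidate referent is scored, and the more likely one wins.

```python
# Minimal sketch: resolve the ambiguous IT by scoring each candidate referent
# with a small causal language model. Model choice (gpt2) is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def sentence_logprob(text: str) -> float:
    """Total log-probability the model assigns to the sentence."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)
    # out.loss is the mean negative log-likelihood per predicted token
    return -out.loss.item() * (ids.shape[1] - 1)

for state in ("Full", "Empty"):
    scores = {
        ref: sentence_logprob(
            f"She poured water from the pitcher into the cup until the {ref} was {state.lower()}."
        )
        for ref in ("cup", "pitcher")
    }
    best = max(scores, key=scores.get)
    print(f"'... until IT was {state}' -> the model reads IT as: the {best}")
```

Ideally the model prefers the Cup when IT was Full and the Pitcher when IT was Empty; a small model like gpt2 may not always get it right, which is part of why the much larger GPT systems feel so different.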
YOUR UNDIVIDED ATTENTION
In the podcast, Your Undivided Attention, co-hosts Tristan Harris and Aza Raskin explore the unprecedented power of emerging technologies: how they fit into our lives, and how they fit into a humane future.
Join us every other Thursday as we confront challenges and explore solutions with a wide range of thought leaders and change-makers — like Audrey Tang on digital democracy, neurotechnology with Nita Farahany, getting beyond dystopia with Yuval Noah Harari, and Esther Perel on Artificial Intimacy: the other AI. more here...
ATTENTION IS ALL YOU NEED
This paper from Google introduced the groundbreaking architecture known as the Transformer, the "T" in Generative Pre-trained Transformer (GPT) and ChatGPT, which revolutionized the field of sequence modeling. The architecture relies heavily on the concept of self-attention, allowing it to capture dependencies between different positions in the input sequence. Watch the Video or Read the Paper Here...
WHAT IS A TRANSFORMER IN AI & MACHINE LEARNING
A Transformer is a type of neural network architecture.
The key innovation of the Transformer is the self-attention mechanism, which allows the model to attend to different parts of the input sequence when making predictions. This is in contrast to traditional recurrent neural networks (RNNs), which process the input sequence sequentially, and convolutional neural networks (CNNs), which apply filters across the entire input.
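As a concrete illustration (a sketch added here, not text from the paper), the scaled dot-product self-attention at the core of the Transformer fits in a few lines of numpy; the single head, the toy dimensions, and the absence of masking are simplifying assumptions.

```python
# Minimal single-head self-attention: Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) learned projection matrices."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # how strongly each position attends to every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # each output is a weighted mix of all positions

# Toy usage: 5 tokens, model width 8, head width 4
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)           # -> (5, 4)
```

Because every position looks at every other position in one shot, the model can link "IT" back to "the Cup" or "the Pitcher" no matter how far apart they sit in the sentence.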
LET'S VERIFY STEP BY STEP - IMPROVING LANGUAGE MODEL REASONING WITH PROCESS SUPERVISION
This video provides a detailed analysis of the "Let's Verify Step by Step" research paper from OpenAI. The authors explore how large language models perform complex multi-step reasoning tasks and discuss how these models can be made more reliable. The study compares two key methods: outcome supervision and process supervision.
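To make that comparison concrete, here is a minimal sketch (the math problem and labels are invented for illustration, not taken from the paper) of how the two training signals are structured: outcome supervision grades only the final answer, while process supervision grades every step of the reasoning chain.

```python
# Illustrative example: the same model-written solution labeled two different ways.
solution_steps = [
    "Step 1: 12 apples split among 4 kids gives 12 / 4 = 3 apples each.",
    "Step 2: Each kid eats 1 apple, leaving 3 - 1 = 2 apples each.",
    "Step 3: So the total left over is 2 * 4 = 9 apples.",  # arithmetic slip: 2 * 4 = 8
]

# Outcome supervision: one label for the whole solution, based only on the final answer.
outcome_label = {"final_answer_correct": False}

# Process supervision: one label per step, pinpointing exactly where the reasoning failed.
process_labels = [
    {"step": 1, "correct": True},
    {"step": 2, "correct": True},
    {"step": 3, "correct": False},  # the reward model learns to flag this step
]

print(outcome_label)
print(process_labels)
```

Roughly, the paper finds that the step-level signal trains a more reliable reward model than the single end-of-solution signal, because it tells the model where the reasoning went wrong, not just that it did.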
Protecting Sensitive and Personal Information
When prompting an AI, avoid sharing sensitive or personal information, such as client data (name, address, phone number, date of birth, SSN, driver's license number, etc.), financial records, or passwords, in order to protect client privacy and to comply with applicable regulations. Be mindful of how your request is phrased to prevent generating harmful, biased, or misleading content, and always review AI outputs for accuracy before sharing or acting on them.
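As a practical sketch (the regex patterns and placeholder labels below are assumptions, not a complete or certified PII filter), a few lines of Python can strip the most obvious identifiers from a prompt before it reaches an AI service; anything a pattern cannot catch still needs human review.

```python
# Minimal sketch: redact obvious personal identifiers from a prompt before sending it.
# The patterns below are illustrative, not exhaustive.
import re

REDACTION_PATTERNS = {
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[.\-]\d{3}[.\-]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(prompt: str) -> str:
    """Replace matched identifiers with labeled placeholders."""
    for label, pattern in REDACTION_PATTERNS.items():
        prompt = pattern.sub(f"[{label} REDACTED]", prompt)
    return prompt

raw = "Client SSN 123-45-6789, phone 555-123-4567, email jdoe@example.com"
print(redact(raw))
# -> Client SSN [SSN REDACTED], phone [PHONE REDACTED], email [EMAIL REDACTED]
```

A filter like this is only a first line of defense: names, addresses, and free-form financial details do not follow neat patterns, so the safest habit is simply not to put them in the prompt at all.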