What is Artificial Intelligence (AI)?


Artificial Intelligence (AI) is a broad field of computer science that focuses on creating intelligent machines that can perform tasks that typically require human-like intelligence, such as problem-solving, decision-making, and perception.

  • Machine Learning (ML) is a subset of AI that focuses on creating algorithms that can automatically learn and improve from experience without being explicitly programmed. In other words, ML algorithms can learn from data and improve their performance over time without human intervention.
    • Deep Learning is a type of machine learning that uses artificial neural networks to enable digital systems to learn and make decisions based on unstructured, unlabeled data. It is a subset of Machine Learning (ML) and essentially a neural network with three or more layers. These neural networks attempt to simulate the behavior of the human brain, allowing them to “learn” from large amounts of data.
      • Generative AI creates original content, including text, images, video, and computer code, by identifying patterns in large quantities of training data.
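To make the "learn from data without being explicitly programmed" idea concrete, here is a minimal sketch in plain Python. The program is never told the rule y = 2x + 1; it only sees example pairs and repeatedly nudges two numbers to shrink its error. The data, learning rate, and iteration count are illustrative choices, not from any particular library.

```python
import random

# Toy data: pairs (x, y) produced by a hidden rule y = 2x + 1.
# The learner never sees the rule, only the examples.
data = [(x, 2 * x + 1) for x in range(10)]

# Start from arbitrary guesses for slope w and intercept b.
w, b = random.random(), random.random()
lr = 0.01  # learning rate: how big each corrective nudge is

# Repeatedly nudge w and b to reduce the squared prediction error.
for _ in range(2000):
    for x, y in data:
        err = (w * x + b) - y
        w -= lr * err * x
        b -= lr * err

print(round(w, 2), round(b, 2))  # should approach 2.0 and 1.0
```

After enough passes over the data, the learned parameters recover the hidden rule — the "improvement from experience" the definition above describes.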


Artificial intelligence is rapidly developing, but how does it work? Experts explain. | NBC News
For many of us, the term artificial intelligence conjures up images of science fiction movies. But what is it really? As AI technology becomes a bigger part of our world, Lester Holt sits down with Tristan Harris and Aza Raskin, co-founders of the Center for Humane Technology, to talk about how it works.

"Godfather of artificial intelligence" talks impact and potential of new AI | CBS Mornings
Geoffrey Hinton is considered a godfather of artificial intelligence, having championed machine learning decades before it became mainstream. As chatbots like ChatGPT bring his work to widespread attention, Brook Silva-Braga spoke to Hinton about the past, present and future of AI.

In the Age of AI (full film) | FRONTLINE
A documentary exploring how artificial intelligence is changing life as we know it — from jobs to privacy to a growing rivalry between the U.S. and China. FRONTLINE investigates the promise and perils of AI and automation, tracing a new industrial revolution that will reshape and disrupt our world, and allow the emergence of a surveillance society.

What is Artificial Intelligence Exactly?
Hi, welcome to ColdFusion (formerly known as ColdfusTion). Experience the cutting edge of the world around us in a fun relaxed atmosphere.

Section 238(g) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019 (Pub. L. 115-232, August 30, 2018) defines AI to include the following:

  1. Any artificial system that performs tasks under varying and unpredictable circumstances without significant human oversight, or that can learn from experience and improve performance when exposed to data sets;
  2. An artificial system developed in computer software, physical hardware, or other context that solves tasks requiring human-like perception, cognition, planning, learning, communication, or physical action;
  3. An artificial system designed to think or act like a human, including cognitive architectures and neural networks;
  4. A set of techniques, including machine learning, that is designed to approximate a cognitive task; and
  5. An artificial system designed to act rationally, including an intelligent software agent or embodied robot that achieves goals using perception, planning, reasoning, learning, communicating, decision making, and acting.



Age of AI: Everything you need to know about artificial intelligence | Devin Coldewey - TechCrunch

  • Diffusion: models are trained by showing them images that are gradually degraded by adding digital noise until there is nothing left of the original. By observing this, diffusion models learn to do the process in reverse as well, gradually adding detail to pure noise in order to form an arbitrarily defined image.
  • Fine tuning: models can be fine tuned by giving them a bit of extra training using a specialized dataset.
  • Foundation Model: the big from-scratch models that need supercomputers to run; they can be trimmed down to fit in smaller containers, usually by reducing the number of parameters.
  • Generative AI: an AI model that produces an original output, like an image or text. Some AIs summarize, some reorganize, some identify, and so on — but an AI that actually generates something (whether or not it “creates” is arguable).
  • Hallucination: Originally this was a problem of certain imagery in training slipping into unrelated output, such as buildings that seemed to be made of dogs due to an over-prevalence of dogs in the training set. Now an AI is said to be hallucinating when, because it has insufficient or conflicting data in its training set, it just makes something up.
  • Inference: stating a conclusion by reasoning about available evidence. Of course it is not exactly “reasoning,” but statistically connecting the dots in the data it has ingested and, in effect, predicting the next dot.
  • Large Language Model (LLM): LLMs are trained on pretty much all the text making up the web and much of English literature. Ingesting all this results in a foundation model (see above) of enormous size. LLMs are able to converse and answer questions in natural language and imitate a variety of styles and types of written documents, as demonstrated by the likes of ChatGPT, Claude, and LLaMa.
  • Model: the actual collection of code that accepts inputs and returns outputs. What a model does or produces depends on how it was trained.
  • Neural network: they are just lots of dots and lines: the dots are data and the lines are statistical relationships between those values. As in the brain, this can create a versatile system that quickly takes an input, passes it through the network, and produces an output. This system is called a model.
  • Training: the neural networks making up the base of the system are exposed to a bunch of information in what’s called a dataset or corpus. In doing so, these giant networks create a statistical representation of that data: words or images are analyzed and given representation in the giant statistical model. Once the model is done cooking, on the other hand, it can be much smaller and less demanding when it’s being used, a process called inference.
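The diffusion entry above can be half-sketched in plain Python. This toy shows only the forward (noising) direction on a 1-D "image": a clean ramp of pixel values is mixed with a little Gaussian noise at each step until almost nothing of the original survives. The hard part a real diffusion model learns — reversing each small step — is omitted; the signal, step count, and alpha value are illustrative assumptions.

```python
import math
import random

random.seed(1)

# A clean 1-D "image": a ramp of 10 pixel values from 0.0 to 1.0.
signal = [i / 9 for i in range(10)]

alpha = 0.8  # fraction of the previous step's signal that survives

# Forward diffusion: at each step, keep sqrt(alpha) of the current
# values and blend in sqrt(1 - alpha) worth of fresh Gaussian noise.
steps = [signal]
x = signal
for t in range(20):
    x = [math.sqrt(alpha) * v + math.sqrt(1 - alpha) * random.gauss(0, 1)
         for v in x]
    steps.append(x)

# After 20 steps the ramp's contribution has amplitude alpha**10 (~0.1),
# so the result is nearly pure noise. A diffusion model is trained to
# undo each of these small corruption steps in reverse.
```

Training a network to invert each step — taking `steps[t+1]` and predicting the noise that produced it from `steps[t]` — is the part this sketch leaves out.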
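A drastically simplified stand-in for the LLM and inference entries: a bigram counter that "trains" on a ten-word corpus and then predicts the next word by statistically connecting the dots. Real LLMs use deep neural networks over billions of tokens; this sketch only captures the predict-the-next-dot idea, and the corpus is invented for illustration.

```python
from collections import Counter, defaultdict

# An invented ten-word corpus standing in for "all the text on the web".
corpus = "the cat sat on the mat the cat ate the fish".split()

# "Training": count which word follows which.
follows = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    follows[cur][nxt] += 1

def predict(word):
    """Inference: emit the statistically most likely next word."""
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # prints "cat" — it follows "the" most often here
```

Swap the counter for a transformer with billions of parameters and the corpus for the web, and the same train-then-infer shape describes an LLM.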
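The neural network, training, and inference entries above can be sketched together: a tiny dots-and-lines network (2 input dots, 4 hidden, 1 output) trained by plain backpropagation on the logical OR function. The layer sizes, learning rate, and dataset are arbitrary illustrative choices, not anyone's production recipe.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# "Dots and lines": every weight below is a line connecting two dots,
# holding a statistical relationship between their values.
n_in, n_hid = 2, 4
W1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [random.uniform(-1, 1) for _ in range(n_hid)]
b2 = 0.0

def forward(x):
    """Inference: pass an input through the network, get an output."""
    h = [sigmoid(sum(w * v for w, v in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return h, sigmoid(sum(w * v for w, v in zip(W2, h)) + b2)

# A tiny "corpus": the logical OR function.
dataset = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

def mean_loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in dataset) / len(dataset)

loss_before = mean_loss()

# Training: expose the network to the dataset over and over, nudging
# each weight downhill on the squared error (plain backpropagation).
lr = 0.5
for _ in range(3000):
    for x, y in dataset:
        h, out = forward(x)
        d_out = 2 * (out - y) * out * (1 - out)
        for j in range(n_hid):
            d_h = d_out * W2[j] * h[j] * (1 - h[j])
            W2[j] -= lr * d_out * h[j]
            b1[j] -= lr * d_h
            for i in range(n_in):
                W1[j][i] -= lr * d_h * x[i]
        b2 -= lr * d_out

loss_after = mean_loss()  # far smaller than loss_before after training
```

Training (the loop) is the expensive part; once it is done, each call to `forward` — inference — is just a handful of multiplications, which is why a finished model is so much cheaper to use than to build.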