Brownstone Journal

To Share Weights from Neural Network Training Is Dangerous



By Jessica Rose at Brownstone.org.
Some organizations and researchers are sharing neural network weights, particularly through the open-weight model movement. Examples include Meta's Llama series, Mistral's models, and DeepSeek's open-weight releases, all promoted as democratizing access to powerful AI. But doing so raises not only security concerns but also a potentially existential threat.
For background, I have written a few articles on LLMs and AIs as part of my own learning process in this very dynamic and quickly evolving Pandora's open box field. You can read those here, here, and here.
Once you understand what neural networks are and how they are trained on data, you will also understand what weights (and biases) and backpropagation are. It's basically just linear algebra and matrix-vector multiplication yielding numbers, to be honest.
More specifically, a weight is a number (typically a floating-point value, a way of writing numbers with decimal points for greater precision) that represents the strength or importance of the connection between two neurons, or nodes, across different layers of the neural network.
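To make that concrete, here is a minimal sketch in Python (using NumPy, with invented numbers rather than anything from a real model): each entry of the matrix W is one such floating-point weight, the strength of the connection between an input node and an output node.

```python
import numpy as np

# Hypothetical toy layer: 3 input nodes feeding 2 output nodes.
# Each entry W[i, j] is a floating-point weight: the strength of the
# connection from input node j to output node i.
W = np.array([[ 0.25, -0.10,  0.80],
              [-0.45,  0.60,  0.05]])

x = np.array([1.0, 0.5, -2.0])   # activations arriving from the previous layer

# Matrix-vector multiplication: each output is a weighted sum of the inputs.
output = W @ x
print(output)   # [-1.4  -0.25] -- just numbers, as promised
```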
I highly recommend watching 3Blue1Brown's videos to gain a better understanding, and it's important that you do; his instructional videos are incredibly good.
Start with this one.
And head to this one.
The weights are the parameter values a neural network learns from data in order to make predictions or decisions and arrive at a solution. Each weight is an instruction telling the network how important certain pieces of information are, like how much to pay attention to a specific color or shape in a picture. These weights are numbers that get fine-tuned during training, thanks to all those decimal points, helping the network figure out patterns.
Examples include recognizing a dog in a photo or translating a sentence. They are critical in the 'thinking' process of a neural network.
You can think of the weights in a neural network like the paths of least resistance that guide the network toward the best solution. Imagine water flowing down a hill, naturally finding the easiest routes to reach the bottom.
In a neural network, the weights are adjusted during training on datasets to create the easiest paths for information to flow through. By emphasizing the most important connections and minimizing errors, they help the network quickly and accurately solve problems, like recognizing patterns or making predictions.
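To see what "minimizing errors" means in practice, here is a minimal, hedged sketch (mean squared error on invented numbers, not any real training run): the network's prediction is compared to the right answer with a loss function, and weight settings that produce a lower loss are the "easier paths" training settles into.

```python
import numpy as np

def mean_squared_error(prediction, target):
    """Average squared difference: the 'error' that training tries to shrink."""
    return np.mean((prediction - target) ** 2)

target = np.array([1.0, 0.0])           # what the network should have said
bad_prediction  = np.array([0.2, 0.9])  # weights far from the easy path
good_prediction = np.array([0.9, 0.1])  # weights closer to the easy path

print(mean_squared_error(bad_prediction, target))   # 0.725 -- high error
print(mean_squared_error(good_prediction, target))  # 0.01  -- low error
```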
If you're an electronic musician, think of weights like the dials on your analog synth that let you tune into the right frequency or sound to, say, mimic a sound you want to recreate, or in fact create a new one. If you're a sound guy, you can also think of it like adjusting the knobs on your mixer to balance different instruments.
Weights are indeed dynamic parameters, meaning that they are very mutable as a neural network works toward a prediction or solution. Each neuron also carries its own bias. The bias's role is to shift the output, allowing the model to better fit the data by adding an offset that adjusts the decision boundary or pattern recognition, independent of the input scaling determined by the weights.
Think of it like this. Imagine you're trying to recreate the sound of a guitar on your synth. The weights control how much of the string pluck or body resonance you hear. If what you hear is a bit flat, for example, the bias is like adding a tiny boost or shift - say, a warm undertone - to make it sound more like the real guitar. This helps the network fine-tune its 'ear' to find the right pattern without changing the main controls.
It basically just makes the model better at matching its output to the real data.
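Here is the same idea as a minimal sketch in code (a single made-up neuron, not anyone's actual model): the weight scales the input, while the bias adds a fixed offset that shifts where the output "turns on," independently of the input.

```python
import numpy as np

def neuron(x, weight, bias):
    """One neuron: weighted input plus an offset, squashed into the 0-1 range."""
    return 1.0 / (1.0 + np.exp(-(weight * x + bias)))   # sigmoid activation

x = 0.0   # an input the weight alone can do nothing with (weight * 0 = 0)

print(neuron(x, weight=2.0, bias=0.0))    # 0.5   -- sits on the fence
print(neuron(x, weight=2.0, bias=3.0))    # ~0.95 -- bias shifts it toward 'yes'
print(neuron(x, weight=2.0, bias=-3.0))   # ~0.05 -- bias shifts it toward 'no'
```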
As the network is exposed to data, it adjusts the weights through a process called backpropagation, tweaking them to minimize errors and improve predictions. Think of those paths of least resistance being reshaped with each training example, like a river carving out better channel...
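For a flavor of what one of those reshaping steps looks like, here is a minimal, hedged sketch of a single gradient-descent update for a single weight (made-up numbers, squared error, no real dataset); full backpropagation does this for millions or billions of weights at once.

```python
# One gradient-descent step for a single weight on one training example.
# A stripped-down sketch of what backpropagation does at scale.
x, target = 2.0, 1.0       # input and the answer we want
weight = 0.1               # current weight (the 'channel' being carved)
learning_rate = 0.05       # how big a step to take

prediction = weight * x                  # forward pass: 0.2
error = prediction - target             # how wrong we are: -0.8
gradient = 2 * error * x                # d(error^2)/d(weight): -3.2
weight -= learning_rate * gradient      # nudge the weight downhill

print(weight)   # 0.26 -- a step closer to the weight (0.5) that fits this example
```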