The AI thread

So, once downloaded I could run it offline, no internet connection required? Except for (maybe) DRM?

Cool. I did not know that.

2 Likes

That is my understanding of how they work. Supporting evidence: They sure seem to heat up my GPU, and the kids don’t complain about them jamming up the internet.

But I haven’t put a packet tracker on the PC to see exactly what it’s doing… In any case, I’d be more worried about Ollama sending my data to Meta than the individual LLM sending things away?

Running it with the cable pulled does seem like a good idea, at least for anything truly personal, and that’s easy enough for me to do with my network so I might look into that! :+1:

3 Likes

Exactly. The models are basically just neural networks that consist of the structure definition and the weights of the connections between the neurons. The downloadable models range in size from about 1 GB all the way up to around 1 TB.

The models cannot be executed on their own. For that you need a runner, i.e. a frontend app, that allocates the computing resources (CPU, GPU, memory) and provides the user interface (chat). There are many applications available for this, most of them open source, which brings transparency to their operation. All of them are free to use.

So if you can trust the frontend, you should be fine. I should mention that some models used to ship with custom initialization code, which would be a potential security threat, but I believe the frontends guard against that nowadays.

So yes, there is no need for an internet connection when running these downloaded models.
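
To make the ‘runner loads the weights’ part concrete, here is a minimal sketch of what a frontend does under the hood. It assumes the open source llama-cpp-python package and a GGUF model file you have already downloaded; the file path and the prompt are just placeholders. It only reads the local file, so no network is involved:

    # Minimal local-only inference sketch (llama-cpp-python + GGUF weights).
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/some-7b-model.gguf",  # placeholder path to the downloaded weights
        n_ctx=2048,        # context window to allocate in RAM/VRAM
        n_gpu_layers=-1,   # offload as many layers as possible to the GPU
    )

    out = llm("Q: What is a neural network? A:", max_tokens=100)
    print(out["choices"][0]["text"])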

4 Likes

Perfect explanation. The only thing to add is that the models behind those web offerings are usually large. Like, huge.

The downloadable ones need to fit the computer they’re downloaded to. Hard disk space is not the issue, but RAM and VRAM are.

To get a smaller model, the big ones are usually distilled; it’s a compromise to get the most out of the restricted space and computing power (rough numbers are sketched after the list below).

So there are two directions to run in this race:

  1. bigger, better, smarter
  2. smaller, good enough, efficient
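
To put rough numbers on why RAM/VRAM is the limiting factor, here is a back-of-the-envelope sketch. It only counts the weights themselves (the context/KV cache adds more on top), and the 4-bit case corresponds to the common quantized downloads:

    # Ballpark memory needed just to hold the weights.
    def approx_weight_gb(params_billion: float, bits_per_weight: int) -> float:
        return params_billion * 1e9 * bits_per_weight / 8 / 1e9

    for params in (7, 13, 70):
        for bits in (16, 4):
            print(f"{params}B model @ {bits}-bit: ~{approx_weight_gb(params, bits):.1f} GB")

    # A 7B model at 4 bits is roughly 3.5 GB and fits a mid-range GPU;
    # a 70B model at 16 bits wants about 140 GB, hence the two directions above.
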
4 Likes

Very good addition about the model sizes and what to expect.

It seems, unfortunately, that size matters for the models. The smaller ones seem to be noticeably dumber, with the current tech anyways.

3 Likes

snigger… said the actress to the bishop :roll_eyes:

But thanks heaps guys. This makes me more inclined to get some hands-on experience and learn as much as I can… Start small and take it from there.

What would anyone recommend as a frontend app? My preference is open source, and especially one where I have control over what ports it uses. I know enough about Wireshark that it’s easy for me to then see if it is phoning home… and I know enough guys and gals who can then help me figure out what and to whom. Some of them even do house calls!

2 Likes

LM Studio is a good start, kinda turn-key if you will, but it can do everything.

Ollama if you later want it to run as a background process, with applications using it as a (local) network service. Think of your IDE checking or writing code for you, or your email client improving text snippets for you.

But LM Studio is best for getting started without too much hassle. They also have “staff picked” recommendations for models.
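
To illustrate the ‘background process used as a local network service’ point: once Ollama is running and you have pulled a model, any program can talk to it over localhost. Here is a small sketch using only the Python standard library; the model name is a placeholder for whatever you actually pulled, and the port is Ollama’s default:

    # Talks only to the local Ollama service on 127.0.0.1:11434,
    # so it is easy to confirm in Wireshark that nothing leaves the machine.
    import json
    import urllib.request

    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps({
            "model": "llama3",  # placeholder: use a model you have pulled
            "prompt": "Explain VRAM in one sentence.",
            "stream": False,
        }).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])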

4 Likes

I was watching Overkill’s exercise in using ChatGPT as a copilot, then role-playing ground controller, and wondered if hosting it locally would provide shorter response times. I also think that using a more detailed prompt would have saved some confusion. Interesting nonetheless.

3 Likes

I feel like Ollama is best if you’re running it on a different physical box… ideally in a container on a decent hypervisor… and if that sounds like gibberish then definitely try LM Studio!

On the subject of AI crew for a flight sim, I feel like that is something that could run nicely on the PC as part of the game if they train a specific AI to do it! Which ties into this idea of big server farms running complex models to train simpler models without a human having to spend time doing it all.

1 Like

I always thought that these guys were ‘off the planet’, but I didn’t mean that literally… until now.

Sam Altman has said on the Huge Conversations podcast that the opportunities offered to Gen Z will be so highly coveted that the early-career jobs of generations past will look “boring” by comparison.

Apparently one such opportunity could be to “explore the solar system on a spaceship in some completely new, exciting, super well-paid, super interesting job”.

Seriously, you can’t make this shite up :roll_eyes:

2 Likes

Cool use of AI in MSFS in that video! I wonder if ChatGPT has access to the game video or just the voice chat. One possible reason for the delays is the speech recognition and synthesis.

I have been running some experiments integrating an LLM into the DCS scripting environment. It is built on top of my dcs-jupyter live connection. It is still very much experimental and currently available in a feature branch. Here are some readme examples of what it can do:

Example commands:

    "Spawn an F-16C at Batumi airbase"
    "Create an A-10C at altitude 1000 meters over Kutaisi"
    "Launch a group of F/A-18C aircraft at Senaki-Kolkhi"

Basically you can ask it to modify the live mission environment by adding units, getting info, etc. Currently it is text chat only.
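
(Not the actual dcs-jupyter code, just a hypothetical sketch of the general idea: the LLM is only given a ‘tool’ it is allowed to call, and a command like the examples above comes back as a structured call that the scripting bridge then executes. All names and signatures below are made up for illustration.)

    import json

    # Hypothetical placeholder for the bridge that would inject a unit
    # into the running mission via the scripting environment.
    def spawn_aircraft(unit_type: str, airbase: str, altitude_m: int = 0) -> str:
        return f"Spawned {unit_type} at {airbase} (alt {altitude_m} m)"

    # What the model might return for "Spawn an F-16C at Batumi airbase":
    tool_call = '{"name": "spawn_aircraft", "arguments": {"unit_type": "F-16C", "airbase": "Batumi"}}'

    call = json.loads(tool_call)
    if call["name"] == "spawn_aircraft":
        print(spawn_aircraft(**call["arguments"]))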

Edit: Regarding Overkill’s MSFS AI companion video, I think we could create a similar ‘copilot’ for DCS and give it access to the aircraft systems. Then have it read some manuals and it might make a good tutor. I wish I had more time at hand for experimenting.

3 Likes

Maybe now the cloak of awe that’s surrounded tech-bros is gone. They are idiots with power. Ten years ago everyone around me, men anyway, were shouting “Elon For President!” I am stupid in almost every regard but on this I feel proven right. They are nothing more than man-children who can code. In a sane world they’d be upper-middle-class, powerless, respected-not-worshiped. In our world, what they say, no matter how vacuous or far-fetched, is taken as gospel.

4 Likes

Reading your post brought to mind this humorous scene in The Survivors with Walter Matthau and Robin Williams.

Since humor is highly subjective (and due to some language), I give you Guns & Pancakes. :upside_down_face:


https://youtu.be/KZVsvCT8YIA&t=45

Wheels

2 Likes

The deeper I drop down this rabbit hole of researching ‘AI’ to try to understand it better, the more it is starting to look like plots for ‘Black Mirror’ episodes:

4 Likes

I hate to sound like an anti-vaxxer railing against “big pharma”, but this really does sound like what people insist the “microchips” were meant to do, except there’s a believable mechanism for the therapy to accomplish it.


Why do we entrust so much power to so few people?

2 Likes

Because where money is is where the truth lies. Our trust is irrelevant. They decide what is sensible and what is moral. Our only responsibility in this process is to shut up and consume.

2 Likes

Servitors incoming


2 Likes


I would like to share a transcript of a ChatGPT chat from yesterday.

The nights are getting darker (here up in the north) and I got the idea to try planetary photography. I have tried it in the past with bad results, so I figured I’d ask the AI for advice. I think it does a superb job helping me achieve as good results as possible with my (el cheapo) equipment.

Here’s the chat:

It wasn’t the weight, it was updating it. Having it all on an iPad/PC is way cheaper. We used to have massive bookcases in the shops and hangar, or microfiche images; those are long gone now that everyone gets the manuals in digital form. It saves a ton of management on keeping everything on the correct revision.