![](https://feddit.nu/api/v3/image_proxy?url=https%3A%2F%2Fhexbear.net%2Fpictrs%2Fimage%2F3a8f8abe-b73d-425e-8649-120df2c65e69.jpeg)
![](https://feddit.nu/api/v3/image_proxy?url=https%3A%2F%2Fhexbear.net%2Fpictrs%2Fimage%2F2c4e08c9-7864-4b30-8012-7ffb79b949f6.jpeg)
“I’ll put it like this. If Elon Musk called me and said, ‘Hey, let’s go to space,’ I’d probably consider going and probably would go because he’s got the resources,” Bloom said.
So, you’re saying there’s a chance.🤞
“I’ll put it like this. If Elon Musk called me and said, ‘Hey, let’s go to space,’ I’d probably consider going and probably would go because he’s got the resources,” Bloom said.
So, you’re saying there’s a chance.🤞
That’s pretty impressive.
https://developer.huawei.com/consumer/en/
No experience, doesn’t look like any western companies support it yet. This is their website. If the English version doesn’t pop up, there’s a drop-down at the very bottom of the page.
Here’s a course on HarmonyOS.
https://developer.huawei.com/consumer/en/training/course/video/C101639032639762016
I just setup tailscale on my home Ubuntu jellyfin server the other day for the first time and it was a totally painless experience. Been working great.🤞
Yeah, I was still young and not experienced enough with the non-NT4 yet.
RIP.
Yep, ran Irix super slow. I didn’t have anything else to run on it as this was right around when broadband came into my area., and while there were plenty of warez sites, downloading niche stuff wasn’t as easy to find. One of the coolest things about it that surprised me was the whole thing was modular. (Remember this was 20 something years ago.) There was a little lever or something on the back and you could just slide out the motherboard module and other stuff for upgrades.
Sadly, I think it ended up in the trash when I went nomadic back in 2010. Might be in a closet somewhere at my parents, but I think I remember have a long emotional struggle over putting in the dumpster.
In the early 2000s, in return for a favor, I got my hands on an SGI O2. And then did nothing with it.😅 It was too slow and I didn’t have any idea of what do with it. But I finally owned an SGI. It was a very pretty paperweight.
Ah, it’s happening; Your phone is going from personal wiretap to narc. The panopticon expands.
Maybe, but 3 years is not a lot of time build the framework for a General Strike. There’s a lot of prep that needs to be done, especially in the US.
Totally unrelated to Fain’s call for a General Strike in the future, I’m sure.
Ah, no worries. Yeah, pretty grim, and I’ve not even gotten into the horror of what they’re gonna do with our biometric data. lol.
It’s been a few years since I’ve used mturk, but there were very few VR based jobs when I last used it. Has that changed?
Yeah, I’m familiar with a bunch of autonomous vehicles/drones being trained in simulated environments, but I’m also thinking stuff like VRChat.
No, it’s not. Maybe strictly for LLMs, but they were never the endpoint. They’re more like a Frontal Lobe emulator, the rest of the “brain” still needs to be built. Conceptually, Intelligence is largely about interactions between Context and Data. We have plenty of written Data. In order to create Intelligence from that Data we’ll need to expand the Context for that Data into other sensory systems; Which we are beginning to see in the combo LLM/Video/Audio models. Companies like Boston Dynamics are already working with and collecting Audio/Video/Kinesthetic Data in the Spatial Context. Eventually researchers are going to realize (if they haven’t already) that there’s massive amounts of untapped Data being unrecorded in virtual experiences. Though I’m sure some of the delivery/ remote driver companies are already contemplating how to record their Telepresence Data to refine their models. If capitalism doesn’t implode on itself before we reach that point, the future of gig work will probably be Virtual Turks where, via VR, you’ll step into the body of a robot when it’s faced with a difficult task, complete the task, and then that recorded experience will be used to train future models. It’s sad, because under socialism there’s an incredible potential for building a society where AI/Robots and humanity live in symbiosis akin to something like The Culture, but it’s just gonna be another cyber dystopia panopticon.
I forget how to use it properly off the top of my head, but nslookup should be able to tell you what the dns lookup looks like and where the mismatch is coming from.
I keep wanting to get one of those leapster devices off eBay and install an emulator on it but I’m not sure if it’s worth it.
:utter-contempt:
If you’ve used it, I’m curious as to what potentials do you see from it?
AI, in whatever form, is a Force Multiplier. Being that we are massively outnumbered and overpowered by the foot soldiers of capital, you’d think the left would be falling over themselves trying to master tech that wildly boosts our abilities to create propaganda, spread propaganda, and perform uh () ethical penetration testing. Because the reality is that this is going to be used by right wingers to do all those things while we ironically yell bazinga or whatever and argue that it’s not really intelligence.
Personally, I’m using ollama locally to run a Mistral llm for coding and computer security related questions. You can bypass most restrictions by asking questions in the context of writing a story. I see a lot of potential in automating much of my security work as the agent/agi frameworks advance. Working with documents is pretty nifty as well and I’m looking forward to when the local models get some of those massive context windows I’ve been hearing about.
The task failed successfully.