It is wild that the longest-living manmade structures have already been built. Satellites and space debris in stable orbits will be around for millions of years, way longer than anything could ever survive on Earth.
That’s even cooler. It would be pretty spooky if modern historical records and knowledge got lost, and all that was left was a gap in written information and several hundred large man-made objects that can be seen with basic optics, or even the naked eye under the right conditions.
You’re right, they’ll just become deities because their motion is regular and visible.
Then when they do de-orbit, a great schism will occur and the resulting wars will end in the re-industrialization of society and they’ll be replaced
What I love about GPS is that it’s so pervasive that maps have basically vanished overnight. And in 1000 years there will be no physical evidence of it.
The only thing future archaeologists would be able to tell is that one day, we stopped making maps, and suddenly we started listening to the stars.
It’s nice to be able to do something like this without having to use an ORM. Especially if you need a version of the data that’s limited to a certain character size.
Like having a replica on the edge that serves the 100 character summaries then only loads the full 1000+ character record when a user interacts with it.
A summary of the review is also more useful than just the first 100 characters of a review.
If the model that’s used for that is moderately light I could see it actually using less energy over time for high volume applications if it is able to lower overall bandwidth.
This is all assuming that the model is only run one time or on record update though and not every time the query is run…
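The ORM-free, size-limited version can be sketched in plain SQL. A minimal sketch using sqlite3; the table and column names (`reviews`, `body`, `summary`) are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reviews (id INTEGER PRIMARY KEY, body TEXT, summary TEXT)")
conn.execute(
    "INSERT INTO reviews (body, summary) VALUES (?, ?)",
    ("A very long 1000+ character review..." * 30, "Short model-generated summary."),
)

# The edge replica only serves the cheap, size-bounded column.
row = conn.execute("SELECT id, substr(summary, 1, 100) FROM reviews").fetchone()
print(row)

# The full record is loaded only when a user interacts with it.
full = conn.execute("SELECT body FROM reviews WHERE id = ?", (row[0],)).fetchone()
print(len(full[0]))
```

Whether the summary column is model-generated or just `substr(body, 1, 100)` is the tradeoff discussed above; the query shape is the same either way.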
That at least has some conceivable benefit? Having a chatbot return memory allocation sizes is comically insecure and dumb
Libs will Photoshop anyone to the left of Biden or to the right of Obama in there.
That area always gets hit pretty hard, but this was a good 10’ higher than they’ve ever gotten. Just around the corner from that Wendy’s is a giant apartment complex where they didn’t get the evacuation order until that Wendy’s was underwater and the road was already gone.
They did use Claude to generate a lot of them though
The scams are insane, there are so many and they’re all good at convincing elderly people who are used to cold calls and call center talk.
Several people in my family have been hit and none of them are even that senile. They just got played for months before they actually executed the scam.
The worst one is that they buy up helpline numbers that are printed on old tech, so when you call the “official” HP helpline printed on your decade-old printer, you get scammers pretending to be HP.
Exactly, there is no “real AI”. It’s an oxymoron.
AI is a fine term because it’s artificial. It’s a facsimile. If they were serious it would just be I
I still think in development environments, limited LLM systems can be used in tandem with other systems like linters and OG snippets to help maintain style and simplify boilerplate.
I use Co-Pilot at work because I do development on the side and need something to help me bash out simple scripts really fast that use our APIs. The codebase we have is big enough now (~50,000 lines and hundreds of files) that it tends to pick up primarily on the context of the codebase. It does still fall back to the general context pretty often though, and that’s a fucking pain.
Having the benefits of an LLM trained on your own code and examples without the drawbacks of it occasionally just injecting random bullshit from its training data would be great.
GameNGen can interactively simulate the classic game DOOM at over 20 frames per second on a single TPU.
Wow, it’s almost as fast as my Motorola Droid 1 at playing doom, but used 10000x the power.
What they’ve basically done is the opposite of compression. They got an LLM to take a pretty well optimized bit of code and make it a massive buggy model.
You’re right, it would return a list containing the first element of the sliced list.
So:
lst[:3:3] == [lst[0]]
Well, technically a sequence containing the first element of whatever sequence you sliced. Take advantage of implementing magic methods and you too can abuse slice notation with your classes.
It’s functionally identical to [list[0]], so you could definitely just refactor your code to use list[:3:3]
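For anyone who wants to check the claim, and see the magic-method abuse in action: `lst[:3:3]` steps by 3 over indices 0..2, so it only hits index 0, and slice notation is just sugar for `__getitem__` with a `slice` object. The `Sliceable` class below is a made-up toy, not anything from the stdlib:

```python
class Sliceable:
    """Toy sequence showing that slice notation is just __getitem__ sugar."""
    def __init__(self, data):
        self.data = list(data)

    def __getitem__(self, key):
        if isinstance(key, slice):
            # key.start, key.stop, key.step are whatever you wrote between the colons
            return Sliceable(self.data[key])
        return self.data[key]

lst = [10, 20, 30, 40, 50]
print(lst[:3:3])              # [10] -- step 3 over indices 0..2 only reaches index 0
print(lst[:3:3] == [lst[0]])  # True

s = Sliceable("abcde")
print(s[1::2].data)           # ['b', 'd']
```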
Even better is to create a decorator and just wrap the offending functions:
def shut_up(func):
    def call(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            print("shit's fucked, but I'll be quiet about it")
            return None
    return call

@shut_up
def add(x: int, y: int):
    print(x + y)

add(1, 2)
add(-1, 2)
add(1, "2")

>>> 3
>>> 1
>>> shit's fucked, but I'll be quiet about it
Or if you want to attempt to salvage it:
def shut_up(func):
    def call(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            try:
                return func(*map(int, args), **kwargs)
            except Exception:
                print("shit's really fucked, I even tried to fix it for you")
                return None
    return call
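With the salvage version, the earlier mixed-type call actually recovers, since the retry coerces every positional arg to int. A self-contained run of the same decorator:

```python
def shut_up(func):
    # same salvage decorator as above
    def call(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            try:
                # retry with every positional arg coerced to int
                return func(*map(int, args), **kwargs)
            except Exception:
                print("shit's really fucked, I even tried to fix it for you")
                return None
    return call

@shut_up
def add(x: int, y: int):
    return x + y

print(add(1, "2"))    # 3 -- the str "2" gets coerced on the retry
print(add(1, "two"))  # prints the complaint, then None
```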
You just have to do it manually in Python or rely on systems like mypy to get ‘static’ checking.
It can be useful to rely on duck typing though if you don’t want to have to re-implement or inherit from an existing base class to use a higher-order function.
If the function is written in a way where all it needs is specific methods or parameters from its input objects, you can really save on interface bloat.
But if someone is used to writing statically typed code, dealing with that can create a lot of confusion. I always end up writing a ton of overload signatures, either in my code or in a .pyi sidecar, so whoever is using it can still get some form of feedback that what they’re doing is okay. Or at the very least I put that information in the docstring so people can read the docs as they’re using the function.
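One way to get both (duck typing without interface bloat, plus static feedback) is `typing.Protocol`, which matches structurally, so callers never have to inherit from anything. The names here (`SupportsQuack`, `herd`, `RubberDuck`) are made up for illustration:

```python
from typing import Protocol

class SupportsQuack(Protocol):
    # structural type: anything with this method matches, no inheritance needed
    def quack(self) -> str: ...

def herd(ducks: list[SupportsQuack]) -> list[str]:
    """Only needs .quack(), so mypy can still check callers against that."""
    return [d.quack() for d in ducks]

class RubberDuck:  # never imports or subclasses SupportsQuack
    def quack(self) -> str:
        return "squeak"

print(herd([RubberDuck(), RubberDuck()]))  # ['squeak', 'squeak']
```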
c, b, a = ('unpacking', 'prefer', 'I')
print(a, b, c)
It just looks like an iPad on a stand, which looks like a giant iPhone.
I can get the “unified design language” thing, but like try to make something cool.