When you bring threads into it, these exotic features make more sense. I have been doing single-threaded stuff for the most part.
I just never learned smart pointers and write C++ code like it’s C for aesthetic reasons.
I’ve been using C++ almost daily for the past 7 years and I haven’t found a use for shared_ptr, unique_ptr, etc. At what point does one stop being a noob?
Thanks. In my experience, Wine and Proton don’t work as well as native for one of the apps I’m building, so I will need to either build in a container or say “use Ubuntu version X”.
I do, but Linux should be a first-class platform alongside Windows.
Mainly getting builds onto platforms catering to Windows users and gamers. The consensus here seems to be using containerized build environments.
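Concretely, the containerized setup people here suggest usually amounts to building on the oldest distro you intend to support, so the binary links against an old glibc and runs on anything newer. A sketch (the base image, package list, and build commands are placeholders, not a recommendation):

```dockerfile
# Build on the oldest supported distro so the resulting binary's glibc
# requirement is the floor, not the build machine's version.
FROM ubuntu:18.04
RUN apt-get update && apt-get install -y build-essential cmake
WORKDIR /src
COPY . .
RUN cmake -B build -DCMAKE_BUILD_TYPE=Release && cmake --build build
```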
Thanks for the info! If I’m doing container builds anyways, this looks tasty.
I don’t use dependencies that don’t have a history of backwards compatibility, and when I do, I ship them. It’s SOP to assume basic things like a GUI “just work”, and it’s also SOP for Ubuntu to ship non-functional programs that were broken by GTK and Qt updates. I’d rather have buggy/broken software with undefined behavior than software that just doesn’t run.
I’ll probably have to use chroot or Docker. I tried force-linking against an older glibc, but when I run objdump -T I still see symbols that slip through against the newer glibc, even when they’re .symver’d in the header. That project hasn’t been updated in a long time.
Containers aren’t too bad for storage from a developer’s perspective. I’m talking about the dependency versioning bullshit of flatpak and snap specifically for end users. I don’t know if AppImage technically counts as a container, but the whole point of it is to ship libraries the end user doesn’t have, which implies a fundamental flaw in the hierarchical dependency tree or distribution model - the end user should already have everything they need to run software.
The sad reality is that when you look at the files being requested, it’s usually scrapers looking for exploits.