Not necessarily, you still need backups or snapshots, especially of your home directory, in case software has a nasty bug like deleting your data.
Yup, and I am getting sick of hearing this even on Arch Linux. Like, mofo, you could literally run a snapshot or backup before upgrading, don't blame us if you're yoloing your god damn computer. Windows has exactly the same problem too, and this is why we have backups. Christ.
On my Arch Linux install, I literally have a pacman hook that forcibly runs a backup and verifies said backup before doing a system-wide update.
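For anyone curious, a hook along these lines lives in /etc/pacman.d/hooks/ (the file name and backup script path here are placeholders, not my exact setup):

```
# Minimal sketch of a pre-transaction backup hook,
# e.g. /etc/pacman.d/hooks/00-backup.hook
[Trigger]
Operation = Upgrade
Type = Package
Target = *

[Action]
Description = Running and verifying a backup before the upgrade...
When = PreTransaction
# hypothetical script that performs the backup and verifies it
Exec = /usr/local/bin/backup-and-verify.sh
AbortOnFail
```

AbortOnFail is the important bit: if the backup script exits non-zero, the whole transaction gets aborted instead of upgrading anyway.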
That one was an old piece of documentation where some Chinese folks documented a lot of quirks related to the X11 protocol. I paid about $6000 for a translator to translate that doc to English, and I used it to build my own GUI toolkit on Linux that I still use to this day.
How it really works:
mpf_t temperature;
It’s an arbitrary-precision floating point type provided by LibGMP, and you can find more information about mpf_t here.
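A minimal sketch of how you'd actually use it (compile with -lgmp; the precision and values are just for illustration):

```c
#include <gmp.h>
#include <stdio.h>

int main(void)
{
    mpf_set_default_prec(256);            /* precision in bits for new mpf_t variables */

    mpf_t temperature, offset;
    mpf_init_set_d(temperature, 98.6);    /* initialize from a double */
    mpf_init_set_str(offset, "0.000000000000000000000001", 10); /* strings preserve full precision */

    mpf_add(temperature, temperature, offset);
    gmp_printf("temperature = %.30Ff\n", temperature);

    mpf_clear(temperature);
    mpf_clear(offset);
    return 0;
}
```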
Lol, that's one way to put it. Basically a language convergence, not a bad thing to be honest.
Yeah, MLIR is more or less an “IR with dialects”. A lot of IR language specs share a lot in common with one another, so MLIR tries to standardize that commonality between IRs. Because of that, it reduces the amount of IR code that developers have to worry about, and they can progressively expand the available dialects for MLIR as they develop a compiler like IREE.
Yup, I’ve been writing a new shader language to replace GLSL and HLSL for Vulkan Compute purposes, but I eventually switched from SPIR-V IR to MLIR and use the IREE compiler, which accepts MLIR and compiles it to any of CUDA, ROCm, SPIR-V, and so forth.
It’s one of the projects I’ve been working on to outright replace PyTorch/TensorFlow and ban those two frameworks from my office forever. I got fed up not knowing exactly how much I need in terms of memory allocation, computational cost, and so forth when running or training neural net models. Plus I want an easier way to split a model across lower-end GPUs, one that doesn’t rely on Nvidia-only GPUs for CUDA code. I also wanted SPIR-V as a fallback compute kernel, because if CUDA/ROCm is too new for your GPU, you’re SOL, but if you have SPIR-V, chances are any GPU made in the last 10 years that has a Vulkan driver would likely be supported.
One of the biggest pluses with MLIR is that you are also future-proofing your code, because that code could feasibly be recompiled for new devices like neural net accelerator cards, ASICs, FPGAs, and so forth.
Very nice, I was basically forking off Python Lark and rewriting it in C, with some adjustments to the Earley parser in an experiment to parallelize the processing in Vulkan Compute.
I agree with the idea of avoiding having to make your own parser generator; this is precisely what I’m doing and it’s hell. I assumed that you probably want to pick up some understanding of how parsers differ when it comes to writing grammars. As for ease of use and requiring the least understanding, something like an Earley parser is probably the easiest. It would be slower than other parsing algorithms, but it can handle ambiguous grammars, making it ideal for first-timers learning how to write a programming language.
Yep, and if open source licensing could be revoked on a whim, you can imagine the chaos that would ensue. That would be my understanding as well: the old version under the MPL is perfectly fine to fork off, while the newer version might not be, as it is under a different license. One of the reasons I like the Apache License is that it makes it explicitly clear that it’s irrevocable, whereas the MPL operates on the assumption that it’s not revocable. The most fundamental problem with the legal system in the USA is that no law is “set in stone”, and leaving things to assumption opens them to reinterpretation by a judge who may side against you. (Hell, Google vs Oracle on copyrighted APIs is still decided on a case-by-case basis, so take it as you will.)
Disclaimer: I am not a lawyer. I just share what I learned from the Legal Eagle YouTube channel and a few other sources.
I definitely recommend that you start learning about the LL(k), LALR, and perhaps even Earley parsing algorithms. I am assuming you have already picked up a little on LL(1) parsers and basic lexers, so mastering those parsing algorithms is basically the next stop for you.
Once you get a grasp of those things, you are well on your way to designing a programming language.
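If you want a concrete feel for the simplest end of that spectrum, here's a tiny hand-rolled recursive descent parser (the LL(1) style) for arithmetic expressions in C. The grammar and names are just illustrative:

```c
/* Grammar (classic LL(1)-friendly layout):
 *   expr   := term   (('+' | '-') term)*
 *   term   := factor (('*' | '/') factor)*
 *   factor := NUMBER | '(' expr ')'
 */
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>

static const char *p;                       /* current position in the input */

static void skip_spaces(void) { while (isspace((unsigned char)*p)) p++; }

static double parse_expr(void);

static double parse_factor(void)
{
    skip_spaces();
    if (*p == '(') {                        /* '(' expr ')' */
        p++;
        double v = parse_expr();
        skip_spaces();
        if (*p == ')') p++;
        return v;
    }
    char *end;                              /* NUMBER */
    double v = strtod(p, &end);
    p = end;
    return v;
}

static double parse_term(void)
{
    double v = parse_factor();
    for (;;) {
        skip_spaces();
        if (*p != '*' && *p != '/') return v;
        char op = *p++;
        double rhs = parse_factor();
        v = (op == '*') ? v * rhs : v / rhs;
    }
}

static double parse_expr(void)
{
    double v = parse_term();
    for (;;) {
        skip_spaces();
        if (*p != '+' && *p != '-') return v;
        char op = *p++;
        double rhs = parse_term();
        v = (op == '+') ? v + rhs : v - rhs;
    }
}

int main(void)
{
    p = "1 + 2 * (3 + 4)";
    printf("%g\n", parse_expr());           /* prints 15 */
    return 0;
}
```

Parser generators like LALR or Earley automate away this hand-written structure, but seeing it spelled out once makes the trade-offs between the algorithms a lot easier to follow.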
I concur, there are a few problems that might come up on various platforms, like Windows not implementing C11 standard threads and other stuff; you would instead use the TinyCThread library, which works like a polyfill.
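A minimal sketch of what that polyfill usage looks like (the HAVE_C11_THREADS macro is just an illustrative way to pick the header; in practice you'd key off __STDC_NO_THREADS__ and your platform checks):

```c
#if defined(HAVE_C11_THREADS)
#include <threads.h>          /* native C11 threads where available */
#else
#include "tinycthread.h"      /* same thrd_* API, provided by TinyCThread */
#endif
#include <stdio.h>

static int worker(void *arg)
{
    printf("hello from thread %d\n", *(int *)arg);
    return 0;
}

int main(void)
{
    thrd_t t;
    int id = 1;

    if (thrd_create(&t, worker, &id) != thrd_success)
        return 1;

    int result = 0;
    thrd_join(t, &result);    /* wait for the worker and collect its return value */
    return result;
}
```

Because TinyCThread mirrors the C11 API, the same source builds on Windows and on platforms with a real <threads.h>.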
All problems and challenges are workable. If the problem with Debian is out-of-date libraries, you could set up CI/CD for release builds that rebuilds your software when an update occurs and statically links the updated dependencies.
Back to your point, if they didn’t design their code and architecture to be multiplatform like in C, they need to re-evaluate their design decisions.
I would spend it on language translation, basically: paying someone to translate international documentation on things that aren’t documented in the USA no matter where you look.
I think it’s asinine to ask the developers who contribute to your project, literally taking time out of their day to write code and submit PRs, to pay money to you.
I wouldn’t even bother contributing to the project at that point.
This is not the first time this has happened with .NET open source packages; there are some pretty funky things going on, namely:
ImageSharp (they re-licensed from Apache 2 to something like split community/commercial licensing and threw a huge fit over it)
Fody (it expects the software contributors to Fody to be patrons)
It just roots back to my frustration when I was trying to fill in missing implementation details on projects like Skia (at the time it lacked support for Vulkan). My very fundamental core belief is that core libraries like, say, Skia, neural net frameworks, and other crucial projects like that should offer a C API that allows every type and implementation to be extended by any other language that can interface with a C API, by providing your own VTable or whatnot.
One of the approaches I took for my GUI toolkit written in C (specifically on Linux, to replace Qt and GTK) was implementing single-inheritance object-oriented programming in C. If you insert the base class struct at the top of your custom struct type and provide your own VTable for those objects, you can readily extend the underlying library natively in whatever programming language you use, assuming it can talk to a C API in a complete sense.
Let me know if you want a demonstration of this; I would be happy to find the time to set up a small sample to give you an idea of how it’s done.
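In the meantime, here's a rough sketch of the general shape (all the names are hypothetical, this isn't pulled from my toolkit):

```c
#include <stdio.h>

typedef struct Widget Widget;

typedef struct WidgetVTable {
    void (*draw)(Widget *self);           /* overridable "virtual" method */
    void (*destroy)(Widget *self);
} WidgetVTable;

struct Widget {
    const WidgetVTable *vtable;           /* every object carries its VTable */
    int x, y, width, height;
};

/* The library only ever talks to the base type through the VTable. */
static void widget_draw(Widget *w) { w->vtable->draw(w); }

/* A "derived" type: the base struct is the FIRST member, so a Button*
 * can be safely passed wherever a Widget* is expected. */
typedef struct Button {
    Widget base;
    const char *label;
} Button;

static void button_draw(Widget *self)
{
    Button *b = (Button *)self;           /* downcast is valid because base is first */
    printf("button [%s] at (%d, %d)\n", b->label, self->x, self->y);
}

static void button_destroy(Widget *self) { (void)self; /* nothing heap-allocated here */ }

static const WidgetVTable button_vtable = { button_draw, button_destroy };

int main(void)
{
    Button ok = { { &button_vtable, 10, 20, 80, 24 }, "OK" };
    widget_draw(&ok.base);                /* dispatches to button_draw */
    return 0;
}
```

The whole trick is that the base struct is the first member, so a pointer to the derived type is also a valid pointer to the base type, and every "virtual" call goes through the VTable the object carries. Any language that can lay out a C struct and a function pointer table can do the same thing to extend the library.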
And I am also aware of the criticisms of this approach: the verbosity of implementing object-oriented programming in C is kind of absurd, and the API surface balloons. That is largely why I work on a compiler-generator framework specifically to address those challenges by letting me add dialects on top of the C language, such as generics, object-oriented programming, and so on. I brought C closer to C# in terms of syntax and features, and at the end of compilation it still produces readable C language output. It also generates what I call an FFI-JSON: essentially a JSON file that describes all of the types used in a C project, the sizes of integers/floating points, structure types and their fields/offsets/sizes/comments, and function declarations. It’s done in a way that you could read the JSON file and generate your programming language’s binding library, saving you weeks of work.
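To give you an idea, the kind of information in it looks roughly like this (purely a hypothetical sketch of the shape, not the actual format):

```json
{
  "_note": "hypothetical sketch, not the real FFI-JSON schema",
  "target": { "pointer_size": 8, "int_size": 4, "double_size": 8 },
  "structs": {
    "Widget": {
      "size": 24,
      "fields": [
        { "name": "vtable", "type": "WidgetVTable*", "offset": 0,  "size": 8, "comment": "per-object VTable" },
        { "name": "x",      "type": "int32_t",       "offset": 8,  "size": 4, "comment": "" },
        { "name": "y",      "type": "int32_t",       "offset": 12, "size": 4, "comment": "" },
        { "name": "width",  "type": "int32_t",       "offset": 16, "size": 4, "comment": "" },
        { "name": "height", "type": "int32_t",       "offset": 20, "size": 4, "comment": "" }
      ]
    }
  },
  "functions": [
    { "name": "widget_draw", "returns": "void", "params": [ { "name": "w", "type": "Widget*" } ] }
  ]
}
```

From something along those lines, a small script can spit out P/Invoke signatures, ctypes wrappers, and so on, instead of you writing the bindings by hand.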
I don’t think so, on the extensibility aspect alone; some of Rust’s syntax/traits do not map well to other languages when those languages attempt to extend a Rust library. I write C code in a way that makes it extensible from any language.
Yep, the biggest reason why I chose the C language is the Foreign Function Interface. Code you write in C is more than likely usable in just about any other language.
Sure, until you can’t with Flatpak. Flatpak does not safeguard against system binaries, and there are always risks associated with that.
Honestly, I think I am going to move on from Programming.dev; it’s filled with script kiddies like you. Good lord.
Fuck y’all. Good evening.