How do I handle an input that's more than the char size in C?

- In almost all cases you want to be using some encapsulated string class (glib string, C++ string, something like that). The reason is that your question honestly doesn't really have a good answer: if you're storing the name in a statically-allocated char buffer and someone has a name that's more than 50 characters, you're screwed. "Screwed" can include all kinds of things, up to and including crashing your program or introducing a way for people to enter a malicious name and take over your program.
- If you're really set on doing this, e.g. you just want to do it to learn about C memory management, then what you probably want is a dynamic buffer: find out the length of the name before you allocate the space for it, use malloc() to get a buffer of the size you actually need, and put the string there (a sketch follows below). In a modern application, though, it's highly unusual for doing this type of thing yourself to be worth the effort and risk it creates.
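A minimal sketch of that approach, assuming the input arrives on a stdio stream (the function name and the doubling growth strategy are mine, not anything from the original question):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read one line of arbitrary length from stream.
 * Returns a malloc'd NUL-terminated string (caller must free()),
 * or NULL on EOF or allocation failure. */
char *read_line(FILE *stream) {
    size_t cap = 64, len = 0;
    char *buf = malloc(cap);
    if (!buf) return NULL;

    int c;
    while ((c = fgetc(stream)) != EOF && c != '\n') {
        if (len + 1 >= cap) {                 /* grow when full */
            cap *= 2;
            char *tmp = realloc(buf, cap);
            if (!tmp) { free(buf); return NULL; }
            buf = tmp;
        }
        buf[len++] = (char)c;
    }
    if (len == 0 && c == EOF) { free(buf); return NULL; }
    buf[len] = '\0';
    return buf;
}
```

Usage is just `char *name = read_line(stdin);` followed by `free(name);` when you're done. Every byte gets a bounds check before it's stored, which is exactly the discipline a real string class would be doing for you.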
Unsafe
The mushroom world is a giveaway for it being a Mario game... I want to say SMB3? Lava stage from the end world, mushroom level, ???, ice level, little/big level, desert level, level in the sky, Bowser's castle? But the forest image doesn't fit, and what I'm calling the cloud level seems indicative of something else.
Short answer: Unsafe
Long answer: UNSAAAAAAAAAAAAAAAFFFFFFFFFEEEEEEEEEEFE
Holy crap - I read that whole brief summary of the Justice League lore without even connecting it to Lyta Hall and that whole story. You're right, that's a masterpiece. (And, an instructive example of how to make your own excellent stuff without insisting on your own special-snowflake-mode incompatibility with everyone else's stuff.)
Oooh... dude, that's so cool. So that's the 1940 Sandman. Neat.
I second this. Wordpress has by far the best ratio of "quality of the result + number of places you can get fully-supported turnkey hosting for it" to "amount of work and know-how you have to put into it" of anything I'm aware of. It has a huge supporting ecosystem of products that want your money in exchange for some particular trivial feature to add to your site, but the core is free and open source and privacy-respecting, and honestly quite well-engineered if you can agree to close your eyes and pretend it's not made with PHP.
Tigertech and DigitalOcean are two quite good, although somewhat techie-focused, places you can get hosting for cheap if you don't want to self-host.
Interesting, how so? I recognized some of the characters (Brute and Glob) and set pieces, but I didn't think of it all as a coherent story, and I don't know enough about the pre-Gaiman stuff to recognize anything that was a callback.
I definitely remember there being something weird and hallucinatory about the 1974 comic. Superman is pretty straightforward; he can fly and he's really strong, and he beats up bad guys. It's not complicated. Sandman had friends who were hallucinations and traveled around dreams with a gas mask and a sort of magic submarine, solving problems. I definitely remember something otherworldly about it that made it stick in my mind, even having no idea what any of it was about just from the one issue.
Iran has been trying to evade sanctions and continue selling its oil abroad, while the U.S. and its allies have been seizing cargoes since 2019 after the country’s nuclear deal allowing the trade collapsed.
You misspelled "since 2019 when Trump blew up the only good thing that the US has done in the middle east for God knows how long."
I've had plenty of unpopular-opinion arguments here (example) and I've literally never had a post or comment removed that I know of. On reddit it happened several times from "both sides" of the traditional spectrum. I.e. the mods here are significantly less ban-happy than they were on reddit (although you'll definitely get downvoted for saying certain things depending on which community / which instance). Also, because of the smaller size and the UI differences it actually doesn't make too much difference if you get downvoted -- your stuff won't get hidden; it'll just get a sort of scarlet letter of WE DON'T LIKE THIS MAN SAYING THIS, but everyone can still read it.
The dev's explanation, in full, is:
The precompiled implementation is the only supported way to use the macros that are published in serde_derive.
If there is implementation work needed in some build tools to accommodate it, someone should feel free to do that work (as I have done for Buck and Bazel, which are tools I use and contribute significantly to) or publish your own fork of the source code under a different name.
Separately, regarding the commentary above about security, the best path forward would be for one of the people who cares about this to invest in a Cargo or crates.io RFC around first-class precompiled macros so that there is an approach that would suit your preferences; serde_derive would adopt that when available.
Not "Here's why I'm doing this, it might seem weird but there's a good reason" or anything. Just, go fuck yourself, run my binary.
I smell a similar resolution to the xfree86 -> xorg transition, where the community unanimously abandons serde in favor of the fork.
For example, you can assign a float* p0 to a size_t i, then i to a float* p1 and expect that p0 == p1. Here the compiler is free to choose how to calculate i, but other than that the compiler’s behavior is predictable.
I don't think this specific example is true, but I get the broader point. Actually, "implementation defined" is maybe a better term for this class of "undefined in the language spec but still reliable" behavior, yes.
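For what it's worth, the spec does bless a version of that round-trip, just via uintptr_t rather than size_t. A quick sketch of the distinction (my example, not from the thread):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    float f = 1.0f;
    float *p0 = &f;

    /* How the integer value is derived is implementation-defined, but
     * C99 7.18.1.4 guarantees a pointer converted to uintptr_t and back
     * compares equal to the original. (The guarantee is phrased in terms
     * of void *, hence the intermediate casts; size_t promises nothing.) */
    uintptr_t i = (uintptr_t)(void *)p0;
    float *p1 = (float *)(void *)i;

    printf("%d\n", p0 == p1);  /* 1 on any implementation that has uintptr_t */
    return 0;
}
```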
“Undefined behavior” is not “machine-dependent” code
In C, that's exactly what it is (or rather, there is some undefined-in-the-spec behavior which is machine dependent). I feel like I keep just repeating myself -- dereferencing 0 is one of those things, overflowing an int is one of those things. It can't be in the C language spec because it's machine-dependent, but it's also not "undefined" in the sense you're talking about ("clever" programming by relying on something outside the spec that's not really official or formally reliable.) The behavior you get is defined, in the manual for your OS or processor, and perfectly consistent and reliable.
I'm taking the linked author at his word that these things are termed as "undefined" in the language spec. If what you're saying is that they should be called "implementation defined" and "undefined" should mean something else, that makes 100% sense to me and I can get behind it.
The linked author seems to think that because those things exist (whatever we call them), C is flawed. I'm not sure what solution he would propose other than doing away with the whole concept of code that compiles down close to the bare metal... in which case what kernel does he want to switch to for his personal machine?
- Like a lot of lawmakers they have no concept of how the internet works. They think it's like regulating cars (i.e. of course this handful of browser manufacturers will obey the law, so passing the law will control the behavior of the browsers).
- They may think that even a ham-fisted law that doesn't match reality will give them some ability to control what non-tech-savvy citizens are able to see, and there may be some validity to that (although much less than they'd probably hope).
- I don't think this is France's government right now, but certain governments can get a lot of mileage out of laws that are obviously impossible. In Russia, it's illegal to criticize the war, which is obviously impossible to enforce -- and yet, there are a bunch of people in prison, because they criticized the war and the government decided to single them out to be punished. It can be a ridiculous and impossible law; they're still in prison.
Well... I partially agree with you. The final step in the failure-chain was the optimizer assuming that dereferencing NULL would have blown up the program, but (1) that honestly seems like a pretty defensible choice, since it's accurate 99.999% of the time, and (2) that's nothing to do with the language design. It's just an optimizer bug. It's in the same category as C code that mucks around with its own stack, or single-threaded code that has to have stuff marked volatile because of crazy pointer interactions; you just find complex problems sometimes when your language starts getting that close to machine code.
I guess where I disagree is that I don't think a NULL pointer dereference is undefined. In the spec, it is. In a running program, I think it's fair to say it should dereference 0. Like e.g. I think it's safe for an implementation of assert() to do that to abort the program, and I would be unhappy if a compiler maker said "well the behavior's undefined, so it's okay if the program just keeps going even though you dereferenced NULL to abort it."
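The pattern at issue looks something like this; a made-up sketch modeled on the widely-discussed kernel incident, not the actual kernel code:

```c
struct device { int flags; };   /* hypothetical type, for illustration */

int get_flags(struct device *dev) {
    int flags = dev->flags;  /* dereference first: UB if dev == NULL */
    if (dev == NULL)         /* so the optimizer may conclude dev can't be
                                NULL here and delete this check entirely */
        return -1;
    return flags;
}
```

On a machine where page 0 happens to be mapped (which it can be in kernel mode), the dereference itself doesn't trap, and with the check optimized away the NULL case sails right through.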
The broader assertion that C is a badly-designed language because it has these important things undefined, I would disagree with; I think there needs to be a category of "not nailed down in the spec because it's machine-dependent," and any effort to make those things defined machine-independently would mean C wouldn't fulfill the role it's supposed to fulfill as a language.
A lot of people just like their team, and the other team is the enemy, and that's about the size of it. 😕
Why is it a bad thing if someone you don't like says something sensible?
There's a lot of natural alliance between the anti-establishment on the right and the left... that's why the establishment spends so much money and effort making propaganda, trying to make sure that the natural rage of the screwed-over gets channeled to the far right. The rage gets aimed at the left, instead of being properly directed at the people who are screwing them.
I don't feel like it's helping if someone who's a victim of that propaganda makes a good decision, and people on the left don't want to acknowledge it.
I could talk for a long time about things I don't like about C++. This type of stuff doesn't even scratch the surface lol.
Years and years ago I actually wrote up a pretty in-depth blog post about it, even going so far as to show that it's not even faster than the competitors once you've added in all the overbloated garbage that it calls a standard library. I wrote a little toy implementation of some problem in C, Python, C++, and a couple other languages, and lo and behold, the C one was faster by a mile, while the C++ one using all the easier C++ abstractions was pretty comparable with the others and actually slower than the Perl implementation.
Right, exactly. If you're using C in this day and age, that means you want to be one step above assembly language. Saying C should attempt to emulate a particular specific architecture -- for operations as basic as signed integer add and subtract -- if you're on some weird other architecture, is counter to the whole point. From the point of view of the standard, the behavior is "undefined," but from the point of view of the programmer it's very defined; it means whatever those operations are in reality on my current architecture.
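Signed overflow is the stock example of that tension; a sketch of my own, not from the article:

```c
/* Programmer's intent: detect wraparound, "whatever my CPU does" semantics.
 * The spec's view: signed overflow is undefined, so x + 1 > x is always
 * true, and e.g. GCC or Clang at -O2 will fold this to `return 1`.
 * Compile with -fwrapv if you want guaranteed two's-complement wrapping. */
int no_overflow(int x) {
    return x + 1 > x;
}
```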
That example of the NULL pointer use in the kernel was pretty fascinating. I'd say it's another exact example of the same thing: Russ Cox apparently wants the behavior to be "defined" by the standard, but that's just not how C works or should work. The behavior is defined; the behavior is whatever the processor does when you read memory from address 0. Trying to say it should be something else just means you want to use a language other than C -- which again is fine, but for writing a kernel, I think you're going to have a hard time arguing that the language needs to introduce an extra layer of semantics between the code author and the CPU.
Why is this even still in the library 🥲
Twenty years ago it kind of made sense: OK, it's bad, but sometimes we're just reading a local file fully under our control, maybe from old code whose source doesn't exist anymore; it's such a core function that taking it out, however badly that's needed, will have some negative consequences.
At this point though, I feel like calling it should just play a loud, stern "NO!" over your speakers and exit the program.
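Assuming the function in question is gets() or one of its cousins (the comment doesn't name it, so that's a guess on my part; gets() itself was finally removed in C11), the bounded replacement is nearly a drop-in:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    char buf[64];

    /* gets(buf) would happily write past the end of buf.
     * fgets() stops after sizeof buf - 1 bytes and NUL-terminates. */
    if (fgets(buf, sizeof buf, stdin)) {
        buf[strcspn(buf, "\n")] = '\0';  /* strip trailing newline, if any */
        printf("read: %s\n", buf);
    }
    return 0;
}
```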