- cross-posted to:
- programmerhumor@lemmy.world
- programmerhumor@lemmy.ml
Meme transcription:
Panel 1: Bilbo Baggins ponders, “After all… why should I care about the difference between int and String?”
Panel 2: Bilbo Baggins is revealed to be an API developer. He continues, “JSON is always String, anyways…”
To whoever does that, I hope that there is a special place in hell where they force you to do type safe API bindings for a JSON API, and every time you use the wrong type for a value, they cave your skull in.
Sincerely, a frustrated Rust dev
“Hey, it appears to be int most of the time except that one time it has letters.”
throws keyboard in trash
Rust has perfectly fine tools to deal with such issues, namely enums. Of course that cascades through every bit of related code and is a major pain.
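For instance, a field that is sometimes a number and sometimes a string can be modeled with an untagged serde enum; a minimal sketch (the `Record`/`id` names are invented for illustration):

```rust
// deps: serde (with the "derive" feature), serde_json
use serde::Deserialize;

// Accepts both `"id": 42` and `"id": "42"` from the wire.
#[derive(Debug, Deserialize)]
#[serde(untagged)]
enum IntOrString {
    Int(i64),
    Str(String),
}

#[derive(Debug, Deserialize)]
struct Record {
    id: IntOrString,
}

fn main() {
    let a: Record = serde_json::from_str(r#"{ "id": 42 }"#).unwrap();
    let b: Record = serde_json::from_str(r#"{ "id": "42" }"#).unwrap();
    println!("{a:?} {b:?}"); // Record { id: Int(42) } Record { id: Str("42") }
}
```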
Sadly it doesn’t fix the bad documentation problem. I often don’t care that a field is special and gives either a string or a number. This is fine.
What is not fine, and which should sentence you to eternal punishment, is to not clearly document it.
Don’t you love it when you publish a crate, have tested it on thousands of returned objects, only for the first issue to be “field is sometimes null/other type”? You really start questioning everything about the API, and sometimes you’d rather parse it as serde_json::Value and call it a day. The API is sitting there cackling like a mad scientist in a lightning storm.
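Falling back to `serde_json::Value` looks roughly like this; a small sketch (the `count` field is made up):

```rust
use serde_json::Value;

fn main() {
    // Sometimes a number, sometimes a string, depending on the API's mood.
    let raw = r#"{ "count": "3" }"#;
    let v: Value = serde_json::from_str(raw).unwrap();

    // Accept either representation and give up on stricter typing.
    let count: Option<i64> = match &v["count"] {
        Value::Number(n) => n.as_i64(),
        Value::String(s) => s.parse().ok(),
        _ => None,
    };
    println!("{count:?}"); // Some(3)
}
```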
True, and also true.
This man has interacted with SAP.
The worst thing is: you can’t even put an int in a JSON file. Only doubles. For most people that is fine, since a double can function as a 32-bit int. But not when you are using 64-bit identifiers or timestamps.
That’s an artifact of JavaScript, not JSON. The JSON spec defines a number as a sequence of digits with an optional minus sign, fraction, and exponent. Implementations are not obligated to decode numbers as floating point. Go will happily decode into a 64-bit int, or into an arbitrary-precision number.
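Rust’s serde_json behaves like the Go example here; a quick sketch (nothing API-specific, just a bare JSON number):

```rust
fn main() {
    let raw = "9007199254740993"; // 2^53 + 1, a perfectly valid JSON number

    // Decoded into a 64-bit integer: exact.
    let exact: u64 = serde_json::from_str(raw).unwrap();

    // Decoded into a double: silently rounded down to 2^53.
    let rounded: f64 = serde_json::from_str(raw).unwrap();

    println!("{exact} vs {rounded}"); // 9007199254740993 vs 9007199254740992
}
```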
What that means is that you cannot rely on numbers in JSON. Just use strings.
Unless you’re dealing with some insanely flexible schema, you should be able to know what kind of number (int, double, and so on) a field should contain when deserializing a number field in JSON. Using a string does not provide any benefits here unless there’s some bug in your deserialization process.
What’s the point of your schema if the receiving end is JavaScript, for example? You can convert a string to BigNumber, but you’ll get wrong data if you’re sending a number.
I’m not following your point so I think I might be misunderstanding it. If the types of numbers you want to express are literally incapable of being expressed using JSON numbers then yes, you should absolutely use string (or maybe even an object of multiple fields).
The point is that everything is expressible as JSON numbers; the issue only appears when those numbers are read by JS.
What makes you think so?
```js
const bigJSON = '{"gross_gdp": 12345678901234567890}';

JSON.parse(bigJSON, (key, value, context) => {
  if (key === "gross_gdp") {
    // Ignore the value because it has already lost precision
    return BigInt(context.source);
  }
  return value;
});
// > {gross_gdp: 12345678901234567890n}
```
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse
Because no one is using JSON.parse directly. Do you guys even code?
Relax, it’s just JSON. If you wanted to not be stringly-typed, you’d have not used JSON.
(though to be fair, I hate it when people do bullshit types, but they got a point in that you ought to not use JSON in the first place if it matters)
As if I had a choice. Most of the time I’m only on the receiving end, not the sending end. I can’t just magically use something else when that something else doesn’t exist.
Heck, even when I’m on the sending end, I’d use JSON. Just not bullshit ones. It’s not complicated to stick to static types, or to have discriminant fields.
You HAVE to. I am a Rust dev too and I’m telling you, if you don’t convert numbers to strings in json, browsers are going to overflow them and you will have incomprehensible bugs. Json can only be trusted when serde is used on both ends
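In serde terms that usually means pushing the 64-bit fields through strings; a sketch using a made-up `Event`/`id` struct (the `serde_with` crate’s `DisplayFromStr` does the same job with less boilerplate):

```rust
// deps: serde (with "derive"), serde_json
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct Event {
    // Sent as "18446744073709551615" instead of a bare number,
    // so a JS client can't silently round it to a double.
    #[serde(with = "u64_as_string")]
    id: u64,
}

// Tiny helper module for #[serde(with = ...)].
mod u64_as_string {
    use serde::{de::Error, Deserialize, Deserializer, Serializer};

    pub fn serialize<S: Serializer>(v: &u64, s: S) -> Result<S::Ok, S::Error> {
        s.serialize_str(&v.to_string())
    }

    pub fn deserialize<'de, D: Deserializer<'de>>(d: D) -> Result<u64, D::Error> {
        String::deserialize(d)?.parse().map_err(Error::custom)
    }
}

fn main() {
    let json = serde_json::to_string(&Event { id: u64::MAX }).unwrap();
    println!("{json}"); // {"id":"18446744073709551615"}
}
```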
This is understandable in that use case. But it’s not every day that you deal with values in the range of overflows, so I mostly assumed it is fine.
Well, apart from float numbers and booleans, all other types can only be represented by a string in JSON. Date with timezone? String. BigNumber/Decimal? String. Enum? String. Everything is a string in JSON, so why bother?
I got nothing against other types. Just numbers/misleading types.
Although enum variants should have a label field for identification if they aren’t automatically inferable.
Well, the issue is that JSON is based on JS types, but other languages can interpret the values in different ways. For example, Rust can interpret a number as a 64 bit int, but JS will always interpret a number as a double. So you cannot rely on numbers to represent data correctly between systems you don’t control or systems written in different languages.
No problem with strings in JSON, until some smart developer you get JSONs from decides to interchangeably use String and number, and maybe a boolean (but only `false`) to show that the value is not set, and of course `null` for a missing value that was supposed to be optional all along, but go figure that it was.
“1” + “1”
“11”
strings are in base two, got it
Wouldn’t the answer be “10” in that case?
yes, if I could do maths
1+1=11 means base 1
How so?
1 11 111 1111 11111 111111
That’s base 1. By convention, anyway: it doesn’t really fit the pattern of positional number systems as far as I can tell, but it gets called that.
Oh, I get it, was reading as base 2 and confused by that. Essentially Roman numerals without all the fancy shortcuts.
Based
Who calls it that? Who even uses that enough to have given it a name? Seems completely pointless…
That’s unary.
Strings are in base whatever roman numerals are.
int(“11”)
These JSON memes got me feeling like some junior dev out there is upset because they haven’t read and understood the docs.
"true"
You guys have docs?
The code is my bible.
The schema is this SQL statement
Yes, I know the field isn’t nullable in the database. I’m asking you what you are sending me, jack——
(Directed at a colleague)
This isn’t even an issue of middleware sometimes. It’s just… knowing the DB. And I’d rather not spend time learning it when you could just write docs.
You guys can read?
Timing is about right for it to be a batch of newly minted CS grads getting into their first corporate jobs.
Comments? Comments? Who needs comments?
I’ll have you know all of my code is stringly typed.
All my binary code is stringy too.
CBOR for life, down with JSON.
If there are no humans in the loop, sure, like for data transfer. But for, e.g., configuration files, I’d prefer a text-based solution instead of a binary one, and JSON is a nice fit.
What, no! Use TOML or something for config files.
TOML
Interesting… me likes it.
Yaml is more human readable/editable, and it’s a superset of json!
Until someone cannot tell the difference between tab and space when configuring, or you miss one level of indentation. Seriously, whoever thinks indentation should have semantic meaning for computers should burn in hell. Indentation is for us humans, not computers. You can write JSON with or without indentation if you want. Also, use JSON5 to have comments and other good stuff for a config file.
Yaml is just arcane bullshit to actually write as a human. Nor is it intuitively clear how yaml serializes.
It’s entirely disingenuous because who the hell is throwing JSON into YAML without converting it? Oh wow, I changed the file extension and it still works. I’m so glad we changed to YAML!
Yaml is cancer.
What I’d like for a configuration language is a parser that can handle in-place editing while maintaining whitespace, comments, etc. That way, automatic updates don’t clobber stuff the user put there, or (alternatively) have sections of `## AUTOMATIC GENERATION DO NOT CHANGE ##`. You need a parser that handles changes on its own while maintaining an internal representation. Something like XML DOM (though not necessarily that exact API). There’s a handful out there, but they’re not widespread, and not available in every language.
JSON5 is a very good idea, providing much-needed fixes to the JSON spec, but it isn’t really what I’m getting at. Handling automatic updates in place is a software issue, and could be done on the older spec.
Hmm, maybe I am missing the point. What exactly do you mean by handling automatic updates in place? Like, the program that requires and parses the config file is watching for changes to the config file?
As an example, Klipper (for running 3d printers) can update its configuration file directly when doing certain automatic calibration processes. The z-offset for between a BLtouch bed sensor and the head, for example. If you were to save it, you might end up with something like this:
```
[bltouch]
z_offset: 3.020
...
#*# <---------------------- SAVE_CONFIG ---------------------->
#*# DO NOT EDIT THIS BLOCK OR BELOW. The contents are auto-generated.
#*# [bltouch]
#*# z_offset: 2.950
```
Thus overriding the value that had been set before, but now you have two entries for the same thing. (IIRC, Klipper does comment out the original value, as well.)
What I’d want is an interface where you can modify in place without these silly save blocks. For example:
```
let conf = get_config()
conf.set('bltouch.z_offset', 2.950)
conf.add_comment_after('bltouch.z_offset', 'Automatically generated')
conf.save_config()
```
Since we’re declaratively telling the library what to modify, it can maintain the AST of the original with whitespace and comments. Only the new value changes when it’s written out again, with a comment for that specific line.
Binary config formats, like the Windows Registry, almost have to use an interface like this. It’s their one advantage over text file configs, but it doesn’t have to be. We’re just too lazy to bother.
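For TOML specifically, the `toml_edit` crate implements this kind of format-preserving, in-place editing; a minimal sketch (the crate version and the sample keys are assumptions, echoing the Klipper example above):

```rust
// deps: toml_edit = "0.22" (assumed version)
use toml_edit::{value, DocumentMut};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let original = r#"
# User-written comment that must survive the edit
[bltouch]
z_offset = 3.020
"#;

    // Parse into a format-preserving document (keeps whitespace and comments).
    let mut doc: DocumentMut = original.parse()?;

    // Declaratively change just the one value.
    doc["bltouch"]["z_offset"] = value(2.950);

    // Everything else round-trips untouched; only the edited line changes.
    print!("{doc}");
    Ok(())
}
```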
Ahh, then the modification must be done at the AST level, not on the in-memory representation, since any way you do it, you must retain the original.
Hell, no. If I wanted to save bytes, I’d use a binary format, or just fucking zip the JSON. Looking at a request-response pair and quickly understanding the transferred data is invaluable.
If you’re moving away from text formats, might as well use a proper serialisation tool like protobuf…
Yaml?
For the love of all things pure, holy, and just, please do not use YAML in your APIs…
Fine, and if you don’t use json in your API because of the deficiency highlighted in the meme, what format do you use in your API?
I use JSON. I have used Avro for things in Kafka, but I’m not sure the benefits outweigh the negatives. Avro is much more complicated than people think, and most folks don’t really have a strong desire to learn how it should be used, so they do stuff incorrectly. Everybody knows JSON, and it works with everything, though. (Example: so many people just hear that Avro schemas can be backwards compatible but have zero idea that you still need the schema that wrote the message even if you want to read it into a newer one.)
Interestingly, I take the meme as saying a dev is using the wrong types in their serialization format (using strings to store integers), which was my biggest problem with Avro. Mostly from people not using logical types or preferring to use ISO 8601 datetime strings instead of the built-in `timestamp-millis` type.
Explicit types are just laziness, you should be catching exceptions anyways.
I do. I return an error.
A string that represents types…
If an item can have different types, those label fields are actually quite useful, so I don’t see the problem.
It’s the API’s job to validate it either way. As it does that job, it may as well parse the string as an integer.
deleted by creator
Or even funnier: It gets parsed in octal, which does yield a valid zip code. Good luck finding that.
Well shit, my zip code starts with a 9.
I’m not sure if you’re getting it, so I’ll explain just in case.
In computer science a few conventions have emerged on how numbers should be interpreted, depending on how they start:
- decimal (the usual system with digits from 0 to 9): no prefix
- binary (digits 0 and 1): prefix `0b`, so `0b1001110`
- octal (digits 0 through 7): prefix `0`, so `0116`
- hexadecimal (digits 0 through 9 and then A through F): prefix `0x`, so `0x8E`
If your zip code starts with 9, it won’t be interpreted as octal. You’re fine.
Well, you’re right. I wasn’t getting it, but I’ve also never seen any piece of software that would treat a single leading zero as octal. That’s just a recipe for disaster, and it should use `0o116` to be unambiguous. (I am a software engineer, but was assuming you meant it was hardcoded to parse as octal, not some weird auto-detect.)
I’ve also never seen any piece of software that would treat a single leading zero as octal
I thought JavaScript did that, but it turns out it doesn’t. I thought Java did that, but it turns out it doesn’t. Python did it until version 2.7: https://docs.python.org/2.7/library/functions.html#int. C still does it: https://en.cppreference.com/w/c/string/byte/strtol
Interesting that `strtol` in C does that. I’ve always explicitly passed in base 10 or 16, but I didn’t know it would auto-detect if you passed 0. TIL.
It’s been a long time, but I’m pretty sure C treats a leading zero as octal in source code. PHP and Node definitely do. Yes, it’s a bad convention. It’s much worse if that’s being done by a runtime function that parses user input, though. I’m pretty sure I’ve seen that somewhere in the past, but no idea where. Doesn’t seem likely to be common.
PHP and Node definitely do.
Node doesn’t.
```
> parseInt('077')
77
```
- If the input string, with leading whitespace and possible +/- signs removed, begins with 0x or 0X (a zero, followed by lowercase or uppercase X), radix is assumed to be 16 and the rest of the string is parsed as a hexadecimal number.
- If the input string begins with any other value, the radix is 10 (decimal).
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/parseInt
Who tf decided that a 0 prefix means base 8 in the first place? If a time machine is ever invented, I’m going to cap that man, right after the guy who created JavaScript.
Should be like `0o777` to mimic hex’s `0xFF`.
Oof.
I guess this is one of the reasons some linters now scream if you don’t provide a base when parsing numbers. But then again, good luck finding it if it happens internally. Still, I feel like a ZIP code should be treated as a string even if it looks like a number.
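For comparison, Rust never auto-detects a radix, so the base is always spelled out; a tiny illustration of the explicit-base approach those linters push for:

```rust
fn main() {
    // `str::parse` is always base 10: "077" is just seventy-seven.
    let decimal: i64 = "077".parse().unwrap();

    // Octal has to be requested explicitly.
    let octal = i64::from_str_radix("077", 8).unwrap();

    println!("{decimal} {octal}"); // 77 63
}
```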
Yep. Much like we don’t treat phone numbers as numbers. The rule of thumb is that if you don’t do any arithmetic with it, it is not a “number” but merely numeric.
Well, we don’t, but every spreadsheet program out in the wild, on the other hand…
/j
Yes, I know that you can force it to become text by prepending `'` to the phone number, choosing an appropriate format for the cells, etc., etc. The point is that this often requires meddling after the phone number gets displayed as something like `3e10`.
I refuse to validate data that comes from the backend I specifically develop against.
JSON doesn’t have ints; it has Numbers, which are IEEE 754 floats. If you want to precisely store the full range of a 64-bit int (anything larger than 2^53 - 1), then string is indeed the correct type.
JSON doesn’t have ints; it has Numbers, which are IEEE 754 floats.
No. Numbers in JSON have arbitrary precision. The standard only specifies that implementations may impose restrictions on the allowed values.
This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.
Let me show you what Ethan has to say about this: https://feddit.org/post/319546/174361
This is String - you’ve seen it before haven’t you, Gollum?
The comment section proves that xml is far superior to json
XML is all round better than Json.
Protocol Buffers are hated, but they are needed.
Do you actually use them?
I’m a student so, yes and no?
I do, but I also don’t think that’s a silver bullet, unfortunately. There’s convenience in code generation and compatibility, at least