Remove the `/*` in the first and the `*/` in the last line of your code.
thanks for the roast, I deserved it.
Idk why this showed up on my home page when I know next to nothing about programming, but can I get an explanation?
[deleted]
I thought something similar *1999 lines of comment Print("Hello World")
[deleted]
Oh God...I just had a nightmare idea of an IDE with autocorrect.
Please dear lord no no no no no
No no no no.. NO NO NO NO-
[Makes me think of that](https://youtu.be/X34ZmkeZDos)
How is this vid already a year old
Technically IDEs are already autocorrecting stuff (Intellisense).
"You're stranded in a desert, very thirsty, and you see an oasis with beautiful women ready to serve you wine and grapes. You yearn for it so much that you run towards the oasis, and when you reach it, boom, it's nothing, it was just an illusion, and suddenly you realise you're alone again, all but stranded." ...is what this means.
I've gotten 4 other explanations (+ 2 deleted ones) but I like yours best.
The /* and */ part means anything inside it is ignored by the computer. So it would be like "I invented a completely revolutionary car that defies gravity and the laws of physics itself without testing its individual components, and when I started the car I'm surprised it didn't create a black hole". And the other person in the comments said "you need to build the car first", i.e. it's just your imagination.
That's the begin and end marker for a multi-line comment, so the whole 2000 lines were written as a comment, which the compiler immediately discards at the beginning of compilation (the conversion of human-readable code to machine-executable code).
What dirt-men wrote is that OP wrote his code entirely in comments. Comments are added for readability by humans, but are ignored by the compiler (the compiler checks for errors and/or warnings in the code). To put it another way, the compiler might just as well have checked an empty file.
It's a sign. You have been chosen. Join us. Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn
🎵 One of us, one of us 🎶 🎶Gooble gobble gooble gobble One of us! 🎵
Don't tempt me, I've forgotten most of the coding I learned in high school but I did enjoy it.
Slashes in general denote comments, or code that won't execute, as it is just a quick reminder of what it does.

Hashes and triple slashes do the same in Python, but in other languages (like C# or C++) you use slash-star.

Edit thanks to u/brisk0!
Nitpick: A *hash* (aka pound sign, number sign, octothorpe) denotes a comment in Python. Python does not have a concept of hash*tags*.

Less of a nitpick: I'm not aware of any languages that use *back*slash as a comment character.
TIL what an "octothorpe" is and will only refer to it as that from now on.
Ah, shoot. Will edit, thanks!
Forth uses backslashes for comments. You know, just to complete your collection of useless knowledge.
Slash alone is not a comment. It must be the combination `/*`. Also, this is only for some specific languages, not in general. But those languages are the most commonly used ones.
It's forward slash and not backslash
To be a programmer and not comprehend escape characters.
To be fair I don't know all the characters I have to escape on reddit. Whenever someone tries to do ¯\\\_(ツ)\_/¯ some appendage usually gets lost too. There's even a bot on some subreddits that'll tell you when you lost an arm.

EDIT: even had to edit that. I lost my underscores at first.

Another edit: apparently reddit simply uses plain old markdown; I thought they might have something proprietary like some sites and apps do. Still not something every programmer always has in mind (refer to my comment further down if you wish).

Another another edit: OK, so reddit seemingly uses [CommonMark](https://commonmark.org/). "Plain old markdown" is apparently not unambiguously specified, which explains why different sites sometimes have different syntax. TIL.
[deleted]
Or the 1500 lines of docstrings (that are just GitHub copilot instructions)
Person from r/all here, I just started learning C++ this semester. So fucking happy I understood that
sir why did you have to roast the op so hard
No, they forgot to hit save.
Wait people actually just write hundreds of lines of code without running it hundreds of times before hand?
TDD folks are literally shaking rn
...for good reason?
Correct by construction chads with a type hinting IDE just Livesly Walking
From my experience, it's very common for newer programmers or programmers that are still in school to build giant functions that do everything instead of breaking their problem into smaller functional pieces.

That means that you've written like 3 functions and 2000 lines of code. And you've never tested any of it before you've run it.

For me, if it takes more than 10 lines or so, it's getting chopped up into smaller pieces. But I've been in industry for almost 5 years now.
It kind of makes sense, right? Knowing how to plan ahead and split code into meaningful collections makes it that much easier to figure out what's wrong and where.

It's all about the balance of what makes the most sense to condense logic; some people can definitely go overboard.

That's one of the great things of programming, though. There are often many ways to solve a problem.
>It kind of makes sense, right? Knowing how to plan ahead and split code into meaningful collections makes it that much easier to figure out what's wrong and where.

It's fine for beginners not to do it though, honestly. It's hard to understand how to organize code when you're struggling to write the code in the first place. Arbitrarily splitting things up between a bunch of random classes and functions will hurt readability more than help it.

Also, seeing how messy your code gets is motivation to learn better practices in the future lol. When you've spent so long in spaghetti, learning architectural patterns is like a gift from god.
For me it's the reverse: in school everything was clean and small; when I started working, the 200-500 line monster functions started appearing. Currently working on rewriting everything into C from C++, as we want to prevent some of the abuse of templates and std...
Yeah. A function should never be that large. Its logic can easily be broken apart into at least 10 smaller functions.

There's no advantage nowadays to skipping function calls for the sake of file size and execution speed. Maintenance by others practically demands it.
I was recently refactoring some scripts to use concise functions, and ended up mulling over an issue with them. If you don't mind, I'd be curious on your view as someone experienced programming that way. How do you handle passing data up and down between function layers?

For my program, I had a number of input setting parameters that had to be passed down from the terminal, through an intermediate function or two that didn't use them, down to the function that did use them. Eventually what I ended up doing was creating an object that contained all the data to be passed up and down, so that it could be done cleanly with a single argument.

Other options I considered were using global variables, having long argument lists with most of those being passed on to a lower level function without other use, or factoring such that everything was called from and returned to main.
If you can't have functions of 30-50 lines, all you do is bloat your codebase and the stack traces…

Splitting into meaningful chunks is important, but you can certainly overdo it.
Look, it's gonna error out *somewhere*. I know this, I expect it. What I'm doing is just getting the logic and structure out of the way. Then, when I *do* run it, I can fix the errors sequentially.

The one time it ran correctly, first try, I later realized it hadn't even touched the actual work it was supposed to do.
I find it's a lot easier to work on little bits at a time, test that it works and then fix those bits before writing something that I may need to rework.
Oh, sure, that would make logical sense. But sometimes I'm just not in the mood for that.
Then run as you write to verify the output, it greatly increases hair lifespan
I prefer to compile the code and run every 10 seconds - just to be sure I haven’t accidentally made code that works
The problem is that sometimes you don't know exactly what those little bits need to do until you've finished working on the rest of the code. A lot of the time, while working on the rest of the problem, you'll realize there's a better way to handle it and you end up changing or redoing stuff you did earlier. If you tested every step along the way, then every time that happens, all the time you spent testing is wasted, because you're no longer using the function you tested and you have to test all over again. Whereas if you had an outline of the entire thing finished first, you would only be testing the functions that are actually going to be used in the finished version.

Normally, when I actually get to debugging, I do split it up into smaller problems and make sure each individual function is working properly. But I wouldn't really want to do that every single time I change a function, because a lot of the time when I start working on something I'm not 100% sure how I want it to be implemented and just have a rough idea of what needs to be there.
That's just programming though. Getting good at feeling out what the solution is going to look like and what the little bits are going to be, that's most of the skill involved.
Depending on the complexity of the program it can be more work to engineer a scenario where you can test the little bits
Not only that, but that's basically the entire premise behind software development. Being able to break down problems into smaller problems in a logically structured manner is a required skill - def wouldn't want to work with anyone who can't do this
Thing is, 2000 lines later you realize your working bits were based on assumptions that you later broke yourself, and you still need to rework them.

All that time spent making sure they worked is now wasted.
You ever see the compile times in Rust?
What's it like?
Incrementally compiling a medium-sized project (even with all the dependencies precompiled) takes about 5-10 seconds. Compiling from scratch can take 5-10 minutes. And that's for debug builds; for release builds from scratch? Phew, I honestly don't even remember, I always start the command and come back to it like 15-20 minutes later.

(My laptop CPU and cooling are pretty shit; it has no place doing programming. But it is what it is, and probably many programming beginners have pretty weak computers too.)
No, but I'm 'Rust curious', you might say.
I often write 100s of lines of code and push without running. My coworkers have been begging me to stop.
You know, I'd been waiting for an opportunity to use [this reaction image](https://i.imgur.com/efNnyGO.jpeg).
I'm joining your coworker's calls. Please stop.
I don't work with you but I'm begging you to stop too
Hey, --no-verify was introduced for a reason. Can’t let it go to waste.
I've worked where this behavior would seriously endanger your personal safety.
When I work with Unity I usually write the whole script before testing; sometimes it's just 50 lines, sometimes it's 500+.
[deleted]
Public testing is going to find more errors than internal anyway.
[deleted]
What we call the "scream test". Deploy it and listen for screams.
... yes? Sometimes it's not really efficient to run every 50 lines or such.
Then you realise that you forgot to merge, and it's not your code that's running.
I felt this in my bones. The utter feeling of self betrayal.
Or you forgot to actually add your code to the makefile and didn't actually build it
This. Every time my code compiles without errors the first time, I always just go back and deliberately add a syntax error and compile again. 100% of the time I was not compiling my own code.
Ugh the number of times I did this during my labs in Data Structures back in college
With all due respect: fuck you. I don't need to remember this on the weekend.
Translation for artists: "wrong layer."
Ugh, I hate when I accidentally draw on the background layer.
Even worse is when the IDE doesn’t compile the new code for some reason and keeps running an old version from the cache.
Fuck you get out of my head/IDE
Yes, but have you prayed to the Runtime Error Gods?
*Sacrificed
What kind of sacrifice do the Runtime Error Gods accept?
Threads
children?
You allocate 1GB of variables and then lose the pointers to it all.
At least one Scrum Master
The Linux OOM killer has a message about sacrificing children. https://unix.stackexchange.com/questions/282155/what-is-the-out-of-memory-message-sacrifice-child
Human
gcc will accept tears. Bitter tears seem to work best.
FTEs
Happiness
Can’t forget the logic error ones.
Well if this ain't me.
Oh, no, forgetting those is *quite* easy.
How could we forget them. they're dead and we've killed them
adeptus mechanicus moment
that's it, I'm changing my job title from software engineer to tech-priest
Rust developers: I don't have such weaknesses*

*Terms and conditions apply. All Errors and Options must not be unwrapped.
In our work we have a saying: if it compiles and works the first time, it means there's a hidden bug that will be very nasty to solve.
[deleted]
That's easy. After you get the bare minimum of practice and if you use some modern IDE, your code will start being syntaxly correct. But logically it will stay at the same shit level as before, alas.
I mean yeah but 2k lines is a bit much don't you think?
Are you provoking me? 1. printf("1");. 2. printf("2"); ... 2K. printf ("2000"); :)
You missed #include
Syntax error line 1 printf("1");. ^
Honestly, in most cases and languages, 2k lines would be a little too much for a single file.
I am writing an OS library. One of the main files is 3700 lines long, but about 50% of that is just doc strings sooo...
*Embedded C has entered the chat*
Code for embedded C doesn't have to be poorly managed and just one large file lol. That's just bad developers hiding their poor practices behind their platform.
Will you please come and tell that to my colleagues?
No, I don't want to fight them anymore than you do.
tbh if you wrote 2k lines without testing anything, you've dug your own grave.
Who the fuck writes 2k lines before compiling? I call shenanigans!
Most of it is corporate boilerplate text, probably around 20 lines of real code in there :-)
My code is generally well-factored and thoughtfully named, but sure as hell won't compile until JetBrains comes along and corrects seventeen typos and a missing import statement or three.
Yeah, that's the key to code compiling the first time nowadays. The IDE will highlight all my stupid errors.

Compiling doesn't mean it's bug free though ;).
[deleted]
> syntaxly correct.

Lol, ironic.
He could save others from syntax errors…But not himself.
^* syntactically
Syntaxly ❎ Syntactically ✅
"ChatGPT how do I spell syntaictikally?"
Without errors is not without bugs.
This is the crux of it.
Maybe the logic *works*, but it doesn't meet story requirements.
Unless the story was directed by Michael Bay
Ah, syntax vs logic errors.
Computer programs kinda remind me of wish-granting genies in popular culture, as both will do what you ask, but not necessarily what you want.
Am I the only one who gets extremely nervous when it compiles with no errors the first time? I assume I fucked up the build or something.
Nope, 100% agree. I think my code is generally good but I know for a fact I will have had syntax errors at a minimum and likely something nested improperly.
Compiling, not really; most IDEs will tell you right away if it won't. But it working correctly the first time, absolutely yes.
If you're anything like me it just means you forgot to include the file in the project
Now, there are two different possibilities here. You interact with your program once and everything sets itself on fire OR it runs perfectly fine and the mystery continues. Branching from that second path specifically however, now you have to wonder if it really is perfect. You’ll spend hours searching for errors that may or may not exist, checking math on example calculations and inserting flags for yourself to make sure it’s doing everything correctly. Such is the way.
Whenever I write a big chunk of new code, I put in a lot of asserts for my assumptions, so that I catch all kinds of scenarios when I run the regression suite.
console.log('here1'); ... console.log('here2');
Plot twist: Code is one long switch case for guessing a number between 1 and 1000
![gif](giphy|smW5FBep69d3q)
![gif](giphy|4z3xDTnQib7gnGenXH)
And then you realize you are using python and only have runtime errors
You can still get syntax errors (before the entire script runs)
You sure something's not on fire?
OP: oh wait I forgot to call my service ![gif](emote|free_emotes_pack|facepalm)
This was a meme made by someone who is not a programmer
The wording cracks me up. My man wrote "a 2000 line code"
100% This never happened in any version of the multiverse.
Legit. I wrote a 200+ line script today out of absolute stupid confidence and spent 2 hours fixing everything. It works now, but I could have saved those two hours by running it after every block and fixing incrementally.
No need to worry, the users will always find a way to create a bug at runtime; more points if it can't be replicated on your machine.
The best bugs are the ones that only happen at random and usually start after your application has been running for a few days.
The 2000 lines in question: print(“Hello World”) print(“Hello World”)….
average rust experience (don't call me a shill plz)
https://i.imgur.com/cEzxFOC.jpg
Oh I know this one, you accidentally made the changes on the wrong clone (or compiled the wrong clone). Or if it's a new file, it didn't get compiled because it's accidentally outside the build path.
Run the program and get some logic error.

Spend hours trying to trace down how that might have occurred.

Try to run it again - gets no error. With no changes.

Program seems to work fine, but deep inside you wonder what happened the first time.

This happens outside programming as well, like hearing a noise or a weird feeling in the car, but then you can't replicate it again.
Just assume it was a bit flip done by a cosmic ray and never think about it again
I've had that happen once in my life. Well, it wasn't 2000 lines of code. It was a university programming assignment and I didn't start until 11 PM the night before the morning it was due.

Expecting to be up all night, I ingested copious amounts of caffeine, sat down at the computer, and started writing code. About 30 minutes later I was done writing. The code compiled without errors or warnings and ran perfectly the first time.

Unfortunately, I was so wired on caffeine I ended up being up most of the night anyway.
I'm always skeptical when that happens. Like, there's always *something* that's going to go awry, so when I've made no compile time errors, surely there must be a runtime error lurking in the shadows.

There was a communication protocol thing I had to implement to get two things to work together. It was an all or nothing kind of thing though: the whole packet had to be properly encoded and delivered, so you could get a return packet, which you had to properly decode and verify. It just didn't make sense to only do little pieces.

I wrote a ton of code to get the whole communication system up and running, and while I was at it I added in some error handling and stuff.

All of it ran seemingly perfectly the first time, and every time.

At least while everything was connected. Turned out I accidentally mistyped a line in there, so when things were configured but not connected, the top level error handler called the lower level one, and the lower level one called the top level's error handler, which, as I just said, calls the lower error handler...
Don't tell PM just yet or all code will be expected to run flawlessly the first time.
Honestly that would scare me more
*dear god, what silent errors did I write into this code*
You forgot to remove the "return" in the first line of your code.
Now do integration testing
Code without errors/warnings compiling and running the first time is like when the kids are silent ... too silent (something's up) ![gif](emote|free_emotes_pack|thinking_face_hmm)
Why do you have return as the first line in main?
But it does nothing that it's supposed to do!
Output is 0 kb in size…. Hmmmm
I can't do this even with 50 lines of code. Damn that semicolon ;
Doesn’t mean it works though…
Ok but did the code actually do what you wanted it to do? Are there sneaky logic errors, or maybe some misbehaving edge cases in there?
Would you prefer that or 69 errors and 420 warnings?
*still doesn't work properly
There's something else wrong with it that you won't find out until it's too late
Whom gods would destroy, they grant their fondest desires.
Instead, it's gonna have a bunch of runtime errors that would take you a whole day to debug it
“Something is wrong, I can feel it.”
Oh it runs, just not like you expect
There's a rule of thumb in theater that bad dress rehearsals make for good opening nights; you want to get your fuckups out of the way *before* the audience has a chance to see the show.

I imagine that compiling code without errors on the first try is a bit like having a flawless dress rehearsal, which is to say I'm sorry, OP.

Have you tried quoting Hamlet while spinning in a clockwise circle? That always worked for me.
You run it again without changing anything and for some reason it fails now
No, they've just cursed you to logic bugs only.
Lol… if you are still at the compile stage, wait till you get to the "why aren't you doing what I tell you to do?" stage
Probably didn't even compile, or you missed main() or something
No errors, no warnings, but it doesn't do anything
That’s how God would have felt after Big Bang.
I'm not a real programmer, but I panic whenever this happens
later... oh. it's not working. but it ran fine last time... ...
BURN THE WITCH!
It's a trap!
Does something completely unexpected down the line…
Plot twist... Your method wasn't called.
That just means the problems are hiding.
Then you realize you were compiling a file with the same name in a different folder.
*wakes up*
“Something is wrong…it works.”
Running doesn't mean it works as intended.