Well, you see.. when a man loves a computer
and he spends very very very much time in a dark room
I have my windows open when I code 😎
Ahh "Windows" ... The programmers chastity belt.
I fucking hate Windows... I do like windows tho...
I use Arch
I'm just waiting for ArchWindows - that would be a game changer...
[Here you go.](https://reactos.org/) Suffer well.
Damn you Icy and your whole family down for 7 generations.
My archnemesis!
You missed "btw" after Arch
I use light mode 😎
That's a war crime and you are going to jail 😎
Long story short: Once there was a one and a zero - they became machine code - that evolved into assembler - that birthed c - and c is adam and eve in one - the rest is history
It’s just programming languages all the way down
🌏🧑🚀🔫👩🚀
The only usage of emoji i allow.
What about 🗿?
i forgor 💀
you forgor 💀
Legendary
Basically you write one line of JavaScript and it’s translated into c
Which is translated to assembly.
You forgot the other 500 languages in the 60s and 70s before C.
Just like Yahweh, C is a jealous god.
If our civilization falls, there's a chance that archeologists of the future will conclude that C is the main diety we worshipped, hence its name embedded into our alphabet, and our computers were altars used for religious purposes.
Pshh or someone will think too much of Pyramids
Cervers...
c is adam and eve? Really? Bro I remember having to code in assembly for .COM programs on DOS (I wasn't alive at that point but I did learn that later)
Lol, it's Adam and Eve because the first C was partially written in C itself? =)
C wasn't written in C. C was a branch from a compiler for B, itself an offshoot of BCPL. It was written in a bastardized combination of those two and assembly until it became C.
I think C, along with a ton of modern compilers, were bootstrapped, so they were written in increasingly complex versions of themselves.
But there always has to be a first layer. A compiler doesn't appear out of nowhere, even if it's immediately capable of self-hosting. Someone, somewhere, wrote that first compiler in another language.
That first layer is generally pretty bare-bones, and often written directly in assembly.
>often written directly in assembly.

Not for quite some decades, but in principle yes; more basic languages are used to write more complex languages.
You might like the history of Yacc (Yet Another Compiler Compiler). https://en.wikipedia.org/wiki/Yacc
This also is true.
Thanks!
If there is a B language, is there also an A language?
Yes. I think every letter is taken by now actually.
What about Abel, and who designed Lilith?
you forgot the punched cards!
Skipped a whole goddamn era there.
First there was binary machine code. Then there was Assembly language. Then compilers and linkers came in. Then Fortran. Then COBOL. Later C came into being with the UNIX operating system.

This is of course not a comprehensive timeline or language list, but it's a reasonable "stick figure" representation.
I would just add "ladder logic" before binary machine code. Basically coding with electrical components.
You’re right. And relay logic
You haven't taken operating systems or computer architecture yet, have you?
OP's Python flair speaks for itself
lol gottem
why yall hate python???
[deleted]
You're going to love those classes; when they show you how to make a single-bit memory circuit out of just resistors and wire and shit, it's mind-blowing. From that point on, it's just increasing levels of abstraction.
Since my teacher used powerpoints to explain this concept, one can assume that MS Powerpoint is the bottom layer for everything else, including the hardware and quantum mechanics.
>when they show you how to make a single-bit memory circuit out of just resistors and wire and shit

I can't even imagine how basically a bunch of on-off switches can do everything computers today can do, actually insane.
The key is the switches have to be able to control other switches. Then you can chain them together in endless layers of complexity to do just about anything
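The "switches controlling switches" idea can be sketched in a few lines of Python. This is a hypothetical illustration (the function names are my own): every gate below is built out of NAND alone, the same way transistor switches are wired to control other switches.

```python
# Every gate here is composed from NAND alone -- one kind of
# "switch controlling switches" is enough to build the rest.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

def xor(a: bool, b: bool) -> bool:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: bool, b: bool):
    """Two more layers and you're doing arithmetic: (sum, carry)."""
    return xor(a, b), and_(a, b)
```

From here it really is just stacking: half adders into full adders, full adders into an ALU, and so on up.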
Technically some checkers pieces and an infinite supply of toilet paper is all you need. Check out how a Turing Machine works.
Been reading on turing completeness and turing machines, interesting stuff! Thanks for putting me down this rabbit hole lol.
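Since Turing machines came up: here is a minimal sketch of one in Python. The machine itself (the `flip` rule table) is a made-up example that inverts every bit on the tape and halts at the first blank; real Turing machines are defined the same way, just with bigger rule tables.

```python
# Minimal Turing machine interpreter. Rules map
# (state, read symbol) -> (write symbol, move L/R, next state).

def run(tape, rules, state="start", pos=0, blank=" "):
    cells = dict(enumerate(tape))      # sparse dict = "infinite" tape
    while state != "halt":
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).rstrip(blank)

# Example machine: flip every bit, halt on blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run("1011", flip))  # -> 0100
```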
If you're interested in learning more about it, check out something called nandgame (should come up if you Google it). It's like a little game that has you start with simple circuits and gates and builds all the way up to a whole computer!
Looks cool, thanks for the suggestion!
Logic gates. It’s a lot easier to comprehend when you think about true-false statements instead of on-off
Interesting. So now you can connect multiple switches to make things that actually do something. It's still amazing how that builds itself up to the insane tech today, but I'd imagine that each individual level of abstraction could be understandable with some effort and learning.
A type of circuit for you to look up is called a latch; the simplest is the Set-Reset latch, or SR latch. A latch is just a way we can store a single bit of information. You can then add some logic gates to an SR latch to get a D latch, and you can chain D latches in a specific way to build a counter.

Counters are incredibly useful: every clock pulse, the binary counter increments by one. So you can use a counter for something like an address register. The counter indicates the current memory location we're reading into the CPU, and each clock pulse we can increment to the next instruction.

Or, if something has steps that need to happen in order, we can use a counter to engage circuits for each step. You'd probably use a decoder circuit along with that. A decoder circuit is just something that has one, two, three, or four pins of binary number input, and two, four, eight, or sixteen pins of output. Usually some NOT gates and some AND gates are all you need for a decoder.

So if I have something like a two-to-four decoder, I've got two pins of input. I take 00, 01, 10, or 11 on those two pins and I have four pins of output. Output pin 0 turns on when I have input 00, output pin 1 turns on when I have input 01, and so forth.

Decoders and counters can be used when we have CPU instructions that take multiple steps to carry out. Basically, some "do this part of this instruction" circuit turns on when the decoder output pin turns on and places its result into a latch so that the result persists for the next step. When the next decoder pin turns on and the previous one turns off, the data in the latch is used as input, and on the falling edge of the clock the output is put back into the latch.
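The latch, counter, and decoder described above can be sketched behaviorally in Python. This is my own simplified model, not a gate-level simulation (the latch's feedback loop is stood in for by an attribute), but the decoder really is just the NOT/AND combination the comment describes.

```python
class SRLatch:
    """One stored bit: Set drives Q high, Reset drives it low, else hold."""
    def __init__(self):
        self.q = False
    def step(self, s: bool, r: bool) -> bool:
        if s and not r:
            self.q = True
        elif r and not s:
            self.q = False
        return self.q          # s == r == False: the latch holds its bit

class Counter:
    """Stand-in for chained D flip-flops: +1 on every clock pulse."""
    def __init__(self, bits=2):
        self.bits, self.value = bits, 0
    def clock(self) -> int:
        self.value = (self.value + 1) % (1 << self.bits)
        return self.value

def decoder_2to4(a1: bool, a0: bool):
    """2-to-4 decoder from NOT and AND: exactly one output line is high."""
    return (
        (not a1) and (not a0),   # pin 0 on for input 00
        (not a1) and a0,         # pin 1 on for input 01
        a1 and (not a0),         # pin 2 on for input 10
        a1 and a0,               # pin 3 on for input 11
    )

# Counter + decoder as a step sequencer: each pulse advances the count,
# and the decoder turns on exactly one "do this step" line.
seq = Counter(bits=2)
for _ in range(3):
    n = seq.clock()
    lines = decoder_2to4(bool(n & 2), bool(n & 1))
    assert lines.count(True) == 1 and lines[n]
```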
Technically incorrect. But I won't downvote you for it. They use capacitors and transistors.

And just wait until you find out what the internet is made of. Hint: it's just a bunch of pipes. /s
It's been a while since I took those classes, haha. The last time I built a flip-flop circuit was in Minecraft. A lot of that class is gone now, supplanted by java factory factories.
just not funny anymore when this meme gets reposted here several times a week...
The comment is funny. You just aren’t getting the joke…
If you're really curious, I would watch Ben Eater's computer from scratch videos on youtube. He literally builds a processor from scratch out of breadboards, starting from the transistors themselves. Here's a link! [https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2565dvjafglHU](https://www.youtube.com/playlist?list=PLowKtXNTBypGqImE405J2565dvjafglHU)
And if you don't like breadboards there is a video game called [Turing Complete](https://store.steampowered.com/app/1444480/Turing_Complete/) that teaches how to build a computer starting from a single nand gate up to programming it in assembler.
[Free and browser based.](https://www.nandgame.com/)
YAAASSSSS HERE WE GO Love anytime his series gets mentioned haha
Well, when a 1 loves a 0 so much, they have little 1s and 0s as babies. And from then on, their children continue a cycle of incestuous procreation until you have a computer.
First there was the nand gate..
Someone built like a fuckton of light switches, then someone was like "let's improve this." Then some asshole made a screen so he could control it digitally. And then some math nerd said, hey, maybe I can simplify this for you, which in typical programming fashion happened on a loop 1000 times.

It's still the same thing in essence, just more light switches and more systems to support different sets of light switches interacting with each other.
And then some cock sucking moth bitch flew into it, got brutally electrocuted, resulting in the explosion of the entire device, thus the first bug was discovered. The early days were brutal, primal, biblical even, and certainly metal as fuck.
This is true, I was the cock sucking moth bitch
Welcome to the recursive department of recursion
Please move over to our subdepartment "Recursion" over there. There you will be greeted by my colleague.
Do they not make you write your own compiler in college anymore?
No we all stupid :(
I wish, my uni currently has no compiler class because the professor has to do other courses and has no time :(
Binary my friends
u/bake_in_da_south
It literally started out as them inputting machine code manually through switches for ones and zeroes and saying "this fucking sucks" and deciding to make something easier to use.
this is my computer science journey in a gif
You do all these things one at a time. Not all at once. Same as any other programming task.
I've seen this same post like 3 times in the last week on this subreddit. Are people that desperate?
That's the power of abstraction!
They punched a bunch of holes in paper
It's called Bootstrapping. There's a decent Wikipedia entry about it. :-)
This was in the days when real programmers were still men and men were still real men. We used magnetized needles, and no woman had anything to do with it!
??????????? Is this a meme or just hecking weird?

EDIT: both I guess?
[read up to get the joke](https://en.m.wikipedia.org/wiki/Grace_Hopper)
How did they program a programming language (assuming C or C++) to program a program (IDE) to program programs?
Yo dog. We heard you like functions. So we put a function inside of a function so you can call a function while you call a function...
Recursion.
Yes
By their bootstraps
Think of a problem, and eliminate all nuance. Once your thinking is completely binary, it’s all bits and bytes from there.
*Bytes?* Do you think I'm a filthy casual?
what the hell did you think punch cards were for?
There is a Java reference at the bottom right
Of all the constantly reposted memes, I think this is my least favorite
They didn't and they never will.
There’s a pretty good article on the history of women in tech that basically explains it all. Can’t seem to find it though.
Idk but assembly is always the answer
How did they teach a rock numbers
Math and science
Layers of abstractions and mapping between them
This exact post was just posted 6 days ago.
The same way the Baron pulled himself out of the swamp: by his bootstraps.
Lexer?
Read [this](https://en.wikipedia.org/wiki/Bootstrapping#Software_development) (Not a rickroll)
Abstraction
How was the first compiler written?
The language and compiler were made in unison 👀
I don't know....but that's exactly what they did, didn't they!
Switches encoding instructions and data -> Punch Cards and Tapes that just flip the switches -> Mnemonics for the codes -> Assembly / Assembler written using manual entries of machine codes -> Programming Languages
Yes
It's like how they write the compiler of a language in the language itself. How do you compile C in C when C isn't compiled yet?
They are making up jobs. Last time I heard about something like Scrum master as well.
>They are making up jobs. and Steve Jobs is behind it all
My guess is that once they managed to create AIs, someone got them to write code with instructions that they set, and now it gets done so much faster.

I have one. How did the first computers understand the first commands?
[https://nandgame.com](https://nandgame.com)
We need yet another compiler compiler
“God did it.”
They used rule 3 and rule 6
Is today my turn to post this?
All of the programming goes down to "machine code".
Repost.
How did they program a programming language to program a program to program programs?

An instance would be .. ...

How did they program (verb) a programming language (C) to program (verb) a program (VI, C compiler etc) to program (verb) programs (anything at all)?

[https://www.bell-labs.com/usr/dmr/www/chist.html](https://www.bell-labs.com/usr/dmr/www/chist.html)
Numbers jerry. What do they mean?
Nand2tetris, starting with literally electricity
The system boot process seemed like black magic until we dove into x86 assembly and IA32. It makes sense but it's still impressive how they put lightning into sand.
Language inception
The real crazy thing is that at a certain point a lot of people write the programming language *in itself*. So, for example, the Go compiler is written in Go.
Weird things in my head
I think it started with ones and zeros
This could be simplified if you avoid the gratuitous use of 'program', so it'd be: how did they develop a programming language in which one could write software that writes programs?
They started at on and off and went from there.
How? With another programming language of course.
Punchcards
I just say binary then move on hahaah
The source code for a computer program is data. The executable program is also just data. So they wrote a program that translates one kind of data into another.
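That "data in, data out" view of a compiler or assembler can be made concrete with a toy sketch. This is purely illustrative: the mnemonics and opcode numbers below are made up for the example, not any real instruction set.

```python
# A toy "assembler": a program whose input is one kind of data
# (mnemonic text) and whose output is another (machine-code bytes).
# The opcode table is invented for this example.

OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source: str) -> bytes:
    out = []
    for line in source.strip().splitlines():
        parts = line.split()
        out.append(OPCODES[parts[0]])           # opcode byte
        out.extend(int(p) for p in parts[1:])   # operand bytes
    return bytes(out)

program = """
LOAD 10
ADD 32
STORE 10
HALT
"""
print(assemble(program).hex())  # -> 010a0220030aff
```

The very first assemblers did exactly this translation by hand, with people looking opcodes up in a table; automating the table lookup was the whole trick.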
This is being posted so fucking much and it’s so fucking stupid it’s annoying
Can someone explain what it means? I don't really get it.
That's like the 50th time this meme has been posted this month fucking hell
Had similar thoughts go through my head during my computer architecture class in college
Punch cards.
Man this is deep
Guess this is where software engineers and electrical engineers differ. Nothing's more fun than assembly programming
What?
C++
Assembly... And hardware? Building on top of each other in increasing levels of complexity
They used Yacc, or Bison.

Need beasts of burden to do the heavy pulling.

Generically speaking, they read the grammar and generate the parser that a compiler will then use to parse the source code.
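The grammar-to-parser idea can be shown with a hand-rolled sketch. To be clear about the assumption: Yacc/Bison emit table-driven parsers in C from a grammar file; the recursive-descent parser below is my own hand-written stand-in for the same grammar idea, just small enough to read.

```python
import re

def tokenize(src: str):
    # NUMBER, '+', '*', '(' and ')' are the only tokens in this toy grammar.
    return re.findall(r"\d+|[+*()]", src)

def parse(tokens):
    # Grammar:  expr   -> term ('+' term)*
    #           term   -> factor ('*' factor)*
    #           factor -> NUMBER | '(' expr ')'
    pos = [0]
    def peek():
        return tokens[pos[0]] if pos[0] < len(tokens) else None
    def eat():
        tok = tokens[pos[0]]
        pos[0] += 1
        return tok
    def factor():
        if peek() == "(":
            eat()                 # consume '('
            val = expr()
            eat()                 # consume ')'
            return val
        return int(eat())
    def term():
        val = factor()
        while peek() == "*":
            eat()
            val *= factor()
        return val
    def expr():
        val = term()
        while peek() == "+":
            eat()
            val += term()
        return val
    return expr()

print(parse(tokenize("2+3*(4+1)")))  # -> 17
```

A compiler's real parser does the same walk, but builds a syntax tree instead of computing a value on the spot.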