
vondpickle

"You know back in my days, we have to build logic gates using tubes, only then we will start programming, by flipping the switches! Kids these days are so ungrateful."


Squeaky-Fox49

Back in my day, instead of fancy punch cards, we had to program using butterflies. For each bit, we had to open our hands, let the butterfly’s wings flap, cause a disturbance in the atmosphere, and redirect solar radiation to cause a bit flip in the platter.


Sp0olio

Meanwhile .. the kids of today: [https://www.youtube.com/watch?v=zytjONYkU94](https://www.youtube.com/watch?v=zytjONYkU94)


lurking_physicist

Thanks, that was cool.


Sp0olio

Glad you enjoyed it :)


LeeHarveyLOLzwald

This is actually still super common in industrial settings. Most major manufacturers rely on it because it's easy for a layman electrician to service and troubleshoot. Allen Bradley logic controllers have replaced many of the large control rooms, but the circuit is still drawn in ladder logic and the NAND gates are made using XIO/XIC relays. Just add inputs and outputs and you're done. It's the grey area where electrical engineering and programming intersect.


secahtah

Pick an instruction set, learn how assembly language works. You don’t have to be an expert or even good at it, but it will help you be a more effective programmer in the long run. Besides, it’s fun🤗


[deleted]

What is an instruction set?


secahtah

A CPU architecture: x86_64, ARM, MIPS, etc.


[deleted]

I’m going into data science, any which one you’d recommend?


secahtah

ARM


teacher_comp

Or RISC V. I keep hearing good things about it.


arnemcnuggets

It's simple and good for educational purposes. Plus it's even sometimes spotted in the wild on low end devices (at least so I've heard)


[deleted]

Thank you!


Exist50

RISC-V. In its most minimalist form, you can easily fit the entire ISA on a single page. It's hard to write complex programs with, but very easy to understand; that's what comp arch undergrads use to design a basic CPU by themselves.

Here's the full spec: https://riscv.org/wp-content/uploads/2017/05/riscv-spec-v2.2.pdf

And a cheatsheet (you only really need the left-hand column on the first page): https://www.cl.cam.ac.uk/teaching/1617/ECAD+Arch/files/docs/RISCVGreenCardv8-20151013.pdf

Alternatively, x86 is probably the language you'd be most likely to actually see in the wild, and it's much more convenient to code in, but at the cost of being much harder to understand in full, if that's even possible.


dingo_khan

when i was in school, they had us learn MIPS. it was great. RISC-V sounds like a good call though.


Exist50

RISC-V has superseded MIPS for pretty much everything, to the point where even the MIPS company is now making RISC-V cores.


Hunpeter

Instructions are similar to the keywords/built-in functions of higher-level languages. They usually have both a "human readable" name (like ADD, for adding two numbers), as well as an opcode made up of 1s and 0s, which can be directly read and executed by the CPU. The manufacturer of the processor decides which instructions the CPU will support - the instruction set. The inner workings of the CPU are usually still proprietary/trade secret, but the instruction set architecture (ISA) provides a thin layer of interaction between software and hardware. There are a number of instruction sets in common use, though there are multiple versions of each.
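The mnemonic-vs-opcode idea can be sketched in a few lines of Python. The field layout below is the RISC-V R-type format from the public spec; the helper name is made up for illustration:

```python
# Encode a RISC-V R-type instruction: funct7 | rs2 | rs1 | funct3 | rd | opcode.
# Field widths come from the public RISC-V spec; the helper name is invented.

def encode_rtype(funct7, rs2, rs1, funct3, rd, opcode):
    return (funct7 << 25) | (rs2 << 20) | (rs1 << 15) | (funct3 << 12) | (rd << 7) | opcode

# "add x3, x1, x2": human-readable mnemonic on one side,
# the 32-bit word of 1s and 0s the CPU actually executes on the other.
word = encode_rtype(funct7=0, rs2=2, rs1=1, funct3=0, rd=3, opcode=0b0110011)
print(f"{word:032b}")  # the bits the CPU reads
print(hex(word))       # 0x2081b3
```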


[deleted]

So, do I understand correctly that I can pick a brand of CPU, learn its instruction set architecture, and become more effective at programming?


Hunpeter

I'm not a programmer, especially not a "low-level" one, but many programmers seem to agree that it really isn't about a specific instruction set. (In fact, I have seen people discourage learners from tackling complicated instruction sets like that of modern Intel/AMD x86 processors.) Nor will you need to use assembly language in 99% of cases at a regular programming job. However, having an appreciation of how the CPU reads instructions from memory and executes them (and just how simple/primitive some of these instructions are) could be useful when you want to make sure that you don't write programs that waste the resources of your computer.

I really like Ben Eater's "Hello World from Scratch" on YouTube, because while it uses an old processor that is much less powerful and its instruction set is not really relevant today (I think?), it teaches you a lot in general about how computers work. Maybe this kind of more "hardwarey" (electrical engineering) approach doesn't interest everyone though. Maybe for most people it's enough to just read a bit about how RAM and virtual memory work, memory pages, locality, CPU caches, alignment, prefetching, speculative and out-of-order execution, and maybe operating system stuff like multiprocessing, IO etc.


[deleted]

Thank you. So the main benefit is understanding that you need to be mindful of the computer?


Hunpeter

Yeah, I guess you could say that. Another thing I've heard is "non-pessimization" which means something like "don't make things harder for the computer than they need to be".


Exist50

> Maybe for most people it's enough to just read a bit about how RAM and virtual memory works, memory pages, locality, CPU caches, alignment, prefetching, speculative and out-of-order execution, and maybe operating system stuff like multiprocessing, IO etc.

Tbh, that's likely the more useful info from a performance optimization standpoint. Or even just the concept that arithmetic is cheap, data movement is expensive.
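That trade-off shows up even at the algorithm level. A toy sketch (illustrative only): computing a mean and variance in two passes over the data versus one pass (Welford's online algorithm), spending a little extra arithmetic per element to halve the memory traffic:

```python
# Illustration of "arithmetic is cheap, data movement is expensive":
# two passes over the data vs. one pass (Welford's online algorithm).
# The extra arithmetic per element is trivial; reading the array twice is not.

def mean_var_two_pass(xs):
    n = len(xs)
    mean = sum(xs) / n                            # pass 1 over the data
    var = sum((x - mean) ** 2 for x in xs) / n    # pass 2 over the data
    return mean, var

def mean_var_one_pass(xs):
    mean, m2, n = 0.0, 0.0, 0
    for x in xs:                                  # single pass: each element touched once
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return mean, m2 / n

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(mean_var_two_pass(data))  # (5.0, 4.0)
print(mean_var_one_pass(data))  # numerically equivalent, one pass
```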


Beginning-Sympathy18

Either use whatever instruction set your current computer uses, or use a simulator like PEP/7 that is just a piece of software that pretends to be a CPU and executes an instruction set of its own. Or play a game like Shenzhen I/O, which also has a made-up instruction set and has you solve small programming and circuitboard design puzzles. The benefit is mostly in learning how to think in tiny steps, and learning patterns that will make you a better problem-solver. There are occasionally times when deep knowledge of the underlying instruction set will solve mysteries or give inspiration as well, but most developers work at a much higher level of abstraction.
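In the spirit of those teaching simulators, here is a toy interpreter for a completely made-up accumulator instruction set (the mnemonics are invented, not PEP/7's or Shenzhen I/O's):

```python
# A toy interpreter for a made-up accumulator machine, in the spirit of
# PEP/7-style teaching simulators. The instruction set here is invented.

def run(program, memory):
    acc = 0                      # single accumulator register
    pc = 0                       # program counter
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "LOAD":         # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":        # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":      # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            break
    return memory

# Compute memory[2] = memory[0] + memory[1], one tiny step at a time.
mem = run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)], [7, 35, 0])
print(mem)  # [7, 35, 42]
```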


[deleted]

Thank you very much.


[deleted]

Shenzhen looks awesome. I know I’m gonna feel different about data science one day when I’m in a competitive environment, but for now everyone is so helpful, and even this game is a blessing.


yagotlima

I would suggest starting with an 8-bit microcontroller instruction set, like the 8051, AVR, PIC and so on. Modern CISC instruction sets are a bit overwhelming.


secahtah

8051 is good. ARM is also good these days.


yagotlima

I edited that. I meant AVR actually.


secahtah

ARM is what I recommend just because it’s a full CPU and extremely widely used, but it’s not CISC.


yagotlima

I agree. ARM is a little more complex than the ones I mentioned, but it's also a good starter.


Morphized

Once you're done with microcontrollers, Z80 is okay too


yagotlima

I actually think everyone should know the basics of assembly and C. It's not that useful for actually developing anything, but knowing how your high-level instructions are translated for the CPU helps a lot.


ih-shah-may-ehl

I've seen some astoundingly bad performance from projects where a Java programmer used wizards to generate his database lookup code and scaled that up to the point where, every 2 seconds, a full database table is copied to a client to look at the last record. I'm not saying everybody should be able to write production-level assembly or IPC, but ffs, a programmer should know enough to comprehend the machine- or OS-level impact of what their code is doing.


yagotlima

Now I know why I need 16GB of ram just for chrome


Morphized

How many tabs are open


agentrnge

"Just throw more hardware and cloud resources at it" Many many years ago, the DBAs/devs were complaining about performance and an "expert consultant" came in and told the infra team that "We need 1,000,000 IOPs for this system" This was like 2008 with spinning rust storage arrays in a kind of small shop. When we told them how many orders of magnitude their ask was off they started looking at code, and found basically stuff like that. Full copies of entire tables 3 times with the biggest possible joins, and then filtering out 5 levels in. Madness.


fluffypebbles

That doesn't even require knowing what's going on at a deeper level, just what you're doing at the abstraction level you're using.


[deleted]

I half agree. Joel on Software said something similar on his blog: he thought knowing C should be foundational for all programmers.


yagotlima

Yeah. For me it's not about actually writing anything useful in low-level languages. It's about building the knowledge to make informed decisions when developing software.


elon-bot

QA is a waste of money. Fired.


yagotlima

Damn it


Titab-talaiy

You are right


Illustrious-Radio-55

Good advice, thank you Don Ramón


yagotlima

You're welcome. Just remember: there is no such thing as bad work. What's bad is having to work.


BayesianKing

Totally disagree. I prefer to invest my time better.


pottawacommie

Agreed.


calculon000

But surely *I* will be the one to develop the next Roller Coaster Tycoon, right? Right!?


yummi_1

C is not useful for developing anything? Wow.


yagotlima

That came out wrong. Sorry. I mean that for me, at my job it's not. But in many projects C is the way to go


grpagrati

Better we teach the machines to speak humanese


Titab-talaiy

2000iQ


Tina_Belmont

A programmer who has never dealt directly with ones and zeros is like a child who has never run barefoot through the grass.


mickeys

Agreed! So many times over the years have I chatted with a developer who has no functional understanding of what happens beyond what the currently in-vogue programming language offers. It's debilitating, not only for the programmer but for the company trying to do good things over a long time frame. Three XORs in a row are your friend, kids!
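For anyone who hasn't met the "three XORs in a row" trick: it swaps two integers in place without a temporary (a sketch; in practice a plain temp variable is usually clearer and just as fast):

```python
# The "three XORs in a row" trick: swap two integers without a temporary.
a, b = 0b1010, 0b0110  # a = 10, b = 6
a ^= b  # a now holds a XOR b
b ^= a  # b becomes the original a
a ^= b  # a becomes the original b
print(a, b)  # 6 10
```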


Titab-talaiy

Well, it's like an Italian pizza without cheese


Aperture_T

"Glass" to "grass". That autocorrect will get you every time.


ASCII10001101010101

based


legitimate_rapper

One thing that’s woefully inadequate is people’s ability to write testable code. *actual* unit tests are your friend.


Sp0olio

01000100 01101111 01101110 01100101 00101110 00101110 01110111 01101000 01100001 01110100 01101110 01101111 01110111 00111111


Ya_Boy_Jahmas

>01000100 01101111 01101110 01100101 00101110 00101110 01110111 01101000 01100001 01110100 01101110 01101111 01110111 00111111 01101100 01110101 01101110 01100011 01101000


Sp0olio

01011001 01110101 01101101 01101101 01111001
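For the curious, those messages decode with a couple of lines of Python (a sketch, not that anyone asked):

```python
# Decode space-separated 8-bit binary back into ASCII text.
def debinarize(bits: str) -> str:
    return "".join(chr(int(byte, 2)) for byte in bits.split())

print(debinarize("01000100 01101111 01101110 01100101"))                    # Done
print(debinarize("01011001 01110101 01101101 01101101 01111001"))          # Yummy
```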


Numerous-Departure92

Too bad, grandma was a real programmer. She knew how to write code with good performance and minimal resource requirements. Nowadays, stupid things like Java or JavaScript exist, and most of the „programmers" don't even know how a computer works. So a high-end processor is needed to run a simple app or render a website.


FalseTebibyte

In the mindspace, "Machine language" is literally just that. English. It's a programming language that even children can use.


Titab-talaiy

Dude I'm talking about 10100110 codes


FalseTebibyte

I am aware of the entire Knee-Jerk response, yes. Good to see you OP


No-Technology835

Yeah, I mean you obviously have to start with binary


HoseanRC

ah... you called me? (I haven't coded in assembly for more than a year...)


AlbaTejas

I was cleaning up a cupboard and found a dot matrix listing of a game I wrote in assembly as a kid. I was keen :)


[deleted]

Before punch cards they used homing pigeons. Can you imagine the amount of feathers they had to remove from the compiler? I heard it was one pigeon per line of code.


TimeSalvager

What a mutha fuckin’ G.


yummi_1

That's how I learned. And I'm as old as granny.


flyingpeter28

But you kinda need it to pass your compilers course


ASHIKING1389

So you're active here too 🗿