The issue isn't static logic. The issue is divorcing instruction decoding from instruction set design to attain performance goals not originally built into the ISA.
It takes, for example, several clock cycles just to decode x86 instructions into a form that can then be readily executed. Several clocks to load the code cache. Several clocks to translate what's in the code cache into a pre-decoded form in the pre-decode cache. Several clocks to load a pre-decode line into the instruction registers (yes, plural) of the instruction fetch unit. A clock to pass that on to the first of (I think?) three instruction decode stages in the core. Three more clocks after that, and you finally have a fully decoded instruction that the remainder of the pipelines (yes, plural) can potentially execute.
Of course, I say potentially because there's register renaming happening, there are delays waiting for instruction execution units to become available in the first place, there's waiting for result buses to become uncontested, ...
The only reason all this abhorrent latency is obscured is that the CPU literally has hundreds of instructions in flight at any given time. Gone are the days when it was a technical achievement that the Pentium could run 2 instructions concurrently. Today, our CPUs have literally hundreds.
(Consider: a 7-pipe superscalar processor with 23 pipeline stages, assuming no other micro-architectural features to enhance performance, still offers 23*7=161 in-flight instructions, assuming you have some other means of keeping those pipes filled.)
This is why CPU vendors no longer put cycle counts next to their instructions. Instructions are pre-decoded into short programs, and it's those programs (strings of "micro-ops", hence micro-op caches, et al.) which are executed by the core at a more primitive level.
Make no mistake: the x86 instruction set architecture we all love to hate today has been a shambling undead zombie for decades now. RISC definitely won, which is why every x86-compatible processor has been built on top of RISC-like cores since the early 00s, if not earlier. Intel just doesn't want everyone to know it because the ISA is such a cash cow these days. Kind of like how the USA is really a nation whose official measurement system is SI, but we continue to use imperial units because we have official definitions that map one to the other.
Oh, but don't think that RISC is immune from this either. It makes my blood boil when people say, "RISC-V|ARM|MIPS|POWER is immune."
No, it's not. Neither is MIPS, neither is ARM, neither is POWER. If your processor has any form of speculative execution and depends on caches to maintain instruction throughput (which is to say, literally every architecture on the planet since the Pentium Pro demonstrated its performance advantages over the PowerPC 601), you will be susceptible to SPECTRE. Full stop. That's the laws of physics talking, not Intel or IBM.
Whether it's implemented as a sea of gates in some off-brand ASIC, in an FPGA, or on the latest nanometer-scale process node from the most expensive fab house on the planet, it won't matter -- SPECTRE is an artifact of the micro-architecture used by the processor. It has nothing whatsoever to do with the ISA. It has everything to do with the performance-at-all-costs, gotta-keep-them-pipes-full mentality that drives all of today's design requirements.
I will put the soapbox back in the closet now. Sorry.
I need a job. While I'm applying to PhD positions, I can't wait around for that.
Nothing makes the reality that I have to do this hit home quite like paying out 10k in bills...
Recent M.Sc. in Mathematics (Number Theory). I can do mathematics, software engineering, etc. I write English passably, and can limp by in German.
Goettingen or remote... I'll move if I must but it'd be a heavy burden at this point. USA citizen in Germany, Czech citizenship in the works but not yet in the bag. This is 2021, surely remote is possible.
Temporary or freelance work also a possibility.
You want to know how broken the SSD market is right now? We asked for quotes on 20 and 40 SSDs, respectively. They're 30%, or €300, more expensive than just two weeks ago. And if we take 40, each SSD costs another €15 on top. The market regulates itself, sure. Set the mining rigs on fire; at least that only burdens the environment once. #AdminLife
@voltur The short of it is that the “I just want to code” developer mindset is ripe for exploitation by businesses that are incentivized, by shareholder and venture-capital obligations, to prey on and abuse the general public through the technology they produce in the form of apps and websites.
Software and other knowledge workers have an obligation to become aware of the political, social, and ethical dimensions of their work and to think about the systems of power they reinforce.
Has anyone gone through the "working student" phase (it's a Germany thing, not sure about other countries) in IT security?
What considerations worked out for you in selecting a role to apply for? Any tips or ideas from your time at a company?
To give an idea: I've never gone through this myself, since I only studied full-time; I worked for about a year before moving into more security-focused work.
reading: Software That Doesn't Suck — Building Subversion
(I'm reading the transcript; you can probably also listen to it, if you're into that sort of thing)
You don’t want to maximize engagement with your hammer. That’s stupid. You don’t want to maximize engagement with your version control system. You just want it to do its job and get out of the way.
this, yes, this. this reminds me so much of every piece of VC funded Open Source tooling.