

As someone who has recruited hundreds of candidates at this point, I would urge caution with lists like this.

By all means read the list, but rather than trying to grind through as many of the questions as possible, use it as inspiration: think about what you're actually curious about, and which questions might tip you one way or the other if you're on the fence. It's far better to have a good conversation about one thing that might affect your decision than a superficial sprint through a bunch of topics you don't actually care that much about.

Secondly, consider that if you do well you are likely to have multiple rounds of interviews. Think about which questions are likely to be most enlightening when you ask each interviewer, and don't shy away from asking two people the same question if you think you'll get different perspectives. If you ask a peer engineer the diversity questions, for instance, you may well get quite a different answer than from the HR/recruiting person (one might give you a cultural insight, the other the policy/factual answer).

I once interviewed a candidate who had printed off a whole list like this and insisted on grinding through the whole thing. It was weirdly off-putting, but he was such a monster programmer that he was already a clear hire. It gave an insight into the thoroughness of his nature, though, and showed that he was absolutely perfect for the (critical) role we were hiring him for, where mistakes were extremely costly.

Hi Fluffy! If you see this, I hope you're doing well.

For the longest time I viewed kernel programming as an exalted realm of "optimized bare-metal software written by visionaries and wizards" (like operating-system core developers of the Dave Cutler variety). Having had the misfortune of dabbling in reverse-engineering Windows drivers and debugging in-tree Linux kernel drivers, I now believe they're often messy hackjobs written by low-paid, average-skilled embedded developers, no less fallible than the average enterprise Java programmer (… expresses a similar sentiment).

EDIT: The wizards are out there (… for example), but they're not evenly distributed, and perhaps the insane cutting-edge code they write (and the less insane code I write) isn't necessarily dependable, since by definition it lies on the edge of human understanding. And I feel the maintainers are sometimes spread far too thin: Dmitry Torokhov manages a good chunk of Linux's entire input system, so it's no wonder he lets mistakes slip by in the touchpad driver. I'm just disappointed that I reported a bug to the mailing lists and never got a reply after about a month.

However, the kernel is fairly hard to test because it's the kernel: a lot of code paths are fully dependent on hardware that can't be mocked. How would you test the bootloader, for example? Or code that depends on a certain processor architecture? The only way is to actually run the code on that hardware. Similarly, it's hard to test the drivers: a lot of them are made by vendors, and the only way to test them is against the devices themselves. And even if you could emulate the device and connect it to the driver, there's a lot you still won't cover (the communication path, the device misbehaving).

Kernel testing is hard for the same reasons kernel development alone is hard: you're building the thing that all other tools require in order to run. It's hard to test things that the runtime itself depends on. For example, if you try to test the memory subsystem, how do you manage the memory needed to actually run the tests?

> OS virtualization relies on re-exposing the hardware and kernel primitives to VMs; no VM is reimplementing the Intel architecture for memory management, for example.

The direction people are moving towards is re-hosting and directed emulation: you run the code in an environment where the analysis tools are better, and emulate just enough of the platform that the code thinks it's executing natively. Some of the published work even looks at the expectations encoded in the source to emulate what the code expects from the "hardware". I also have a tool, used at a couple of companies, that can rehost simpler Linux device drivers and embedded firmware without source-code modifications. Everything's encoded as DT files, so you can just use the configs already in the kernel tree, and even do fun things like simulating physical hardware in a game engine. Since the HW is fake, testing new configurations is just a config change away.
