It is a hardware failure. Screens are complex and sensitive parts that are exposed to a lot of (ab)use. What is cryptic about that?
we think you’d be best with a bigger team with a better support network
Sounds like they think you’re not independent enough for the position. If it is a small team, they might need someone who can immediately start being productive, while they think you will need more coaching to get up to speed.
No need to drag any disabilities into this.
Any liquid is drinkable, some even more than once!
To be fair, speed is relative. Imagine a plane flying at 500 km/h, pursued by another plane at the same speed. If the first plane fires a rocket backwards that accelerates to 200 km/h relative to the plane that fired it, an observer on the ground will still see the rocket doing 300 km/h, in the same direction as the planes. The crew of the second plane, however, will see the rocket approaching them at 200 km/h.
Wind resistance, aerodynamics, etc. will have an impact, but it can work.
Am I missing something? I thought the outage was caused by CrowdStrike and had nothing to do with Microsoft or Windows?
That’s really useful to know. Thank you for sharing!
They say there are 16 screens inside, each with a 16k resolution. Such a screen would have 16x as many pixels as a 4k screen. The GPUs power those as well.
For the number of GPUs it appears to make sense. 150 GPUs for the equivalent of about 256 4k screens means each GPU handles roughly 1.7 4k screens. That doesn’t sound like a lot, but it could make sense.
The power draw of 28 MW still seems ridiculous to me though. They claim about 45 kW for the GPUs, which leaves 27955 kW for everything else. Even if we assume the screens are stupid and use 1 kW per 4k segment, that only accounts for 256 kW, leaving 27699 kW. Where the fuck does all that energy go?! Am I missing something?
AI is a field of research in computer science, and LLMs are definitely part of that field. In that sense, LLMs are AI. On the other hand, you’re right that there is no real intelligence in an LLM.
LLMs don’t have logic; they are just statistical language models.
That is true, but from a human perspective it can still seem non-deterministic! The behaviour of the program as a whole will be deterministic if all inputs are always the same, arrive in the same order, and there is no multithreading. A specific function that is called multiple times with the same arguments, however, may occasionally give a different result, because it can depend on hidden state elsewhere in the program.
Most programs also have input that changes between executions. Hence you may get the same input record, but at a different place in the execution. Thus you can get a different result for the same record as well.
That exact version will end up making “true” false any time it appears on a line number that is divisible by 10.
During compilation, “true” would be replaced by that statement, and within the statement, “__LINE__” would be replaced by the number of the current line. So at runtime, you end up with the line number modulo 10 (% 10). In C, something is true if its value is not 0. So for lines 4, 17, 116, and 39, for example, it ends up being true. For line numbers that are divisible by 10, the result is zero, and thus false.
In reality the compiler would optimise that modulo operation away and pre-calculate the result during compilation.
The original version behaves differently on every run; this version would always give the same result… unless an edit shifts code onto different line numbers and you recompile.
The original version is also very likely to actually be true. This version would be false quite often. You could make false rarer by increasing the 10, but you can’t make it too high, or no line number in the file will ever be divisible by it.
One downside compared to the original version is that the value of “true” can be 10 different things (anything between 0 and 9), so you would get a lot more weird behaviour since “1 == true” would not always be true.
A slightly more consistent version would be
((__LINE__ % 10) > 0)
They are very busy charging an arm and a leg for crappy software with shit support.
I’m in IT in the financial industry. There is indeed still a ton of COBOL around.
According to this article, an average smartphone uses 2 W when in use. That number will largely depend on the screen and the SoC, which can be turned off or placed in a lower power state when the phone isn’t actively being used. (The 5–20 W figure is for charging a phone.)
With 8 of these cells, you’ll have 800 μW, or 0.0008 W, and you need 2 W. You will need to add a few more cells… about 19,992 more. If 8 of these cells are about the same size as a regular smartphone battery, you will need the equivalent of 2,500 smartphone batteries to power just one phone.
Too bad they don’t say how much the new batteries weigh! It would have been fun to see…
If we ballpark it and assume something the size of a regular smartphone battery is 50 g (1.7 oz), then our stack of 20,000 of these new cells could be about 125 kg (275 lbs).
I won’t be replacing any of my batteries just yet.
I like your optimism, but no, they are actually serious.
Sadly, yes. On the off chance you speak Dutch, here is a fact-checking article on that exact ad. I know it’s a weird thing to link articles in uncommon languages, but I came across that article recently and thought it really provided a lot of context, so I’m afraid it’s the best source I have. You can always run it through a translator too :-)
That may be true for the exact hardware you used, and the exact tests you have done. For Microsoft the problem would be that they need to actively continue supporting older and older devices. At some point it makes sense to drop active support. If it works, that’s fine, but they won’t continue testing and fixing for unsupported configurations.
Phrased differently: Microsoft announces the end of support for a product. If you want to pay for it, they will make an exception and continue to support it just for you.
I understand people dislike Windows 11, but complaining about life cycle management isn’t going to help that.
It is more of a “For typical cards, expect very competitive options from AMD. If you want top performance, buy Nvidia.”
In theory it allows them to focus more on the cards that actually get bought, and thus they could make those cards better products.