The fact this was apparently posted by someone from the Netherlands makes this so much funnier.
Some applications use those unused bits to add tags to pointers, but it’s important to mask them out before dereferencing the address. I’m not sure about ARM, but x86-64 requires bits 48-63 to be copies of bit 47 (kinda like sign-extension), ironically to ensure that no one is using those bits to store extra data.
That’s one of the fundamental disagreements between Catholics and Protestants.
A Catholic would argue that veneration of saints isn’t worship, it’s showing respect for someone who exemplified Christian ideals, or died as a martyr. Canonization is basically the religious version of the Medal of Honor.
A Protestant would argue that the distinction between veneration and worship is arbitrary, and veneration of a saint essentially amounts to idolatry anyway.
Well shit, that’s a non-starter then.
I actually added detail that wasn’t already discussed in the article?
We don’t even have true 64-bit addressing yet. x86-64 uses only 48 bits of a 64-bit virtual address, and 64-bit ARM can use anywhere from 39 to 52 bits depending on the specific configuration.
The person who correctly guesses when the AI bubble is gonna pop and shorts Nvidia stock is gonna make a lot of money. Call it The Big Short 2: Electric Boogaloo.
I’ve recently been thinking a lot about the recyclability of plastic. I have several stacks of plastic drink cups from various fast food joints in my kitchen; as much as possible, I try to save up and bundle together similar types of plastic before I throw it in the recycling bin, to try to save some sorting effort. And in doing so, I noticed something.
The thing is, a lot of single-use plastics have very similar properties. PETE, HDPE, polypropylene, solid polystyrene: they’re all used to package similar or identical products. I think they’re more or less interchangeable, and the choice of a given plastic for a given application has more to do with cost, availability and the preferences of the product engineer than with any specific material properties of the plastic itself. There are obviously going to be some exceptions, but I think those are going to be few and far between, and a lot of them could be addressed by switching to other materials.
I think a great first step would be for regulators to encourage/force industries to standardize on one or two types of plastic at most, and eliminate plastics that aren’t worth recycling, like polystyrene. That should reduce the manual labor required by a significant amount once the other plastics are eliminated from the waste stream, and make it feasible to recycle plastics locally instead of shipping them off to a third world country.
I think companies should be taxed or otherwise penalized for the plastic waste they foist on consumers, because often there’s little choice involved unless you want to boycott a company entirely. If I wanted to eliminate plastic cups from my life, I’d pretty much have to stop getting fast food altogether (yes I know I should probably do that anyway, but that’s beside the point). A tax on bulk purchases of plastic may end up being passed down to consumers, but the revenue could be put towards subsidizing production of more renewable materials.
I think food stamp programs could be a strong driver for change on this, as they could refuse to cover products that generate excessive waste. With enough warning, there should be enough time for companies to switch their products to be compliant with little disruption to the consumer.
America doesn’t do anything big unless it’s to beat either China or Russia. Maybe this collider will be the impetus we need to build a bigger one.
It is being used to develop a quantum compass – an instrument that will exploit the behaviour of subatomic matter in order to develop devices that can accurately pinpoint their locations no matter where they are placed,
[…]
The aim of the Imperial College project […] is to create a device that is not only accurate in fixing its position, but also does not rely on receiving external signals.
These statements imply the device can know exactly where it is in space just by measuring some purely internal quantum effect, which conflicts with the principles of Lorentz invariance and relativity.
Both are built around the same idea: there’s nothing special in the laws of physics that changes with where you are or how fast you’re going. That observation is what led to the conclusion that the speed of light is the same in every reference frame, and to Einstein developing the theory of relativity.
In reality, the device needs an external signal to learn its initial position. And it’s unlikely to be perfectly accurate so it may still need periodic updates, just hopefully a lot less frequently.
The London Underground is actually kind of a dumb use-case because it’s fixed infrastructure. You can just have something like RFID tags around the track that the train reads as it goes by. And there’s going to be sensors in the track that report trains’ presence to a central control room. It’s just a good setting to test the device.
What it’s really potentially quite useful for is nuclear submarines since they can stay underwater pretty much as long as their food supplies last, and knowing their position without using sonar or being able to receive GPS signals is quite important for navigation and obstacle avoidance. But the author was probably told to downplay potential military applications.
The article describes the device working in ways that violate relativity, but the actual technical description is a lot cooler.
It’s not a quantum compass, really. It’s a quantum accelerometer and gyroscope. The hope is that its accuracy will lend itself to long-term inertial guidance, which normally needs regular GPS updates to correct errors which accumulate over time.
If that’s WolframAlpha Classic, you probably paid for it a decade ago like I did.
I paid like $5 for the Android app (now WolframAlpha Classic) like 10 years ago and it’s been worth every penny. I use it for anything that needs complicated unit conversions.
WolframAlpha will do the right math, and walk you through it (though IIRC you have to pay for that part).
This is because LLMs do not inherently understand math. They stick characters together that are likely to go together based on the content they were trained on. They’re literally just glorified autocorrect.
If you want a tool that can actually do math from natural language input, try WolframAlpha.
Given how much modern games stream data in and out of VRAM, I think it would actually be quite a significant issue. Although, for modern games the 520M would probably be below minimum requirements anyway. It was just to illustrate my point.
The processor it’s using is linked in the article: https://www.cnx-software.com/2022/08/29/starfive-jh7110-risc-v-processor-specifications/
It’s a system-on-chip (SoC) design with an embedded GPU, the Imagination BXE-4-32, which appears to be designed mainly for smart TVs and set-top boxes.
The SoC itself only has two PCIe 2.0 lanes on separate interfaces so you can’t use both for the same device, and one is shared with the USB 3.0 interface.
That’s not even enough bandwidth to drive an entry-level notebook GPU from over a decade ago. Seriously: the GeForce GT 520M, launched January 2011, wants a full PCIe 2.0 x16 interface. Same with the Radeon HD 6330M. You could probably get away with just 8 lanes if you had to, but not with only one.
The other commenter wasn’t kidding when they said you could get more power out of a Raspberry Pi 4. It’s even mentioned in the article.
The language definitely seems made up just to fuck with people.