__dev

joined 1 year ago
[–] [email protected] 10 points 2 months ago (3 children)

Only until you have any other contributor, as you're then no longer the sole copyright holder. If you still want to work like that, you'll need a CLA.

[–] [email protected] 2 points 2 months ago (1 children)

CRTs (apart from some exceptions) did not have a display buffer. The analog video signal directly controls the output of each electron gun in the CRT, with no digital processing happening in between. The computer on the other end does have display buffers, just like computers do now; eliminating the extra buffers modern monitors add does reduce latency, though.
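For a sense of scale, here's a back-of-the-envelope sketch. It assumes each extra buffer holds exactly one full frame before scan-out, which is a simplification (some buffering schemes add less than a full frame of delay):

```python
# Rough illustration: each extra full-frame buffer in the display path
# delays the image by one refresh interval. Values are illustrative.
def buffer_latency_ms(refresh_hz: float, extra_buffers: int) -> float:
    """Added latency from buffering whole frames before scan-out."""
    frame_time_ms = 1000.0 / refresh_hz
    return extra_buffers * frame_time_ms

print(buffer_latency_ms(60, 1))  # one buffered frame at 60 Hz -> ~16.7 ms
print(buffer_latency_ms(60, 2))  # two buffered frames -> ~33.3 ms
```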

[–] [email protected] 1 points 5 months ago

Wrong. Unified memory (UMA) is not an Apple marketing term, it’s a description of a computer architecture that has been in use since at least the 1970s. For example, game consoles have always used UMA.

Apologies, my google-fu seems to have failed me. Search results are filled with only Apple-related results, but I was able to find material from well before, though nothing older than the 1990s.

While iGPUs have existed for PCs for a long time, they did not use a unified memory architecture.

Do you have an example? Every single one I look up has at least optional UMA support. Reserved RAM was a thing, but it was only set aside for the framebuffer rather than serving as the GPU's entire memory. AFAIK iGPUs have always shared memory like they do today.

It has everything to do with soldering the RAM. One of the reasons iGPUs sucked, other than not using UMA, is that GPU performance is almost always limited by memory bandwidth. Compared to VRAM, standard system RAM has much, much less bandwidth, making iGPUs slow.

I don't disagree, I think we were talking past each other here.

LPCAMM is a very recent innovation. Engineering samples weren’t available until late last year and the first products will only hit the market later this year. Maybe this will allow for Macs with user-upgradable RAM in the future.

Here's a link to buy some from Dell: https://www.dell.com/en-us/shop/dell-camm-memory-upgrade-128-gb-ddr5-3600-mt-s-not-interchangeable-with-sodimm/apd/370-ahfr/memory. Here's the laptop it ships in: https://www.dell.com/en-au/shop/workstations/precision-7670-workstation/spd/precision-16-7670-laptop. Available since late 2022.

What use is high bandwidth memory if it’s a discrete memory pool with only a super slow PCIe bus to access it?

Discrete VRAM is only really useful for gaming, where you can upload all the assets to VRAM in advance and data practically only flows from CPU to GPU and very little in the opposite direction. Games don’t matter to the majority of users. GPGPU is much more interesting to the general public.

gestures broadly at every current use of dedicated GPUs. Most of the newfangled AI stuff runs on Nvidia DGX servers, which use dedicated GPUs. Games are a big enough industry for dGPUs to exist in the first place.

[–] [email protected] 2 points 5 months ago (2 children)

"unified memory" is an Apple marketing term for what everyone's been doing for well over a decade. Every single integrated GPU in existence shares memory between the CPU and GPU; that's how they work. It has nothing to do with soldering the RAM.

You're right about the bandwidth: current socketed RAM standards have severe bandwidth limitations that directly limit the performance of integrated GPUs. Again, this has little to do with being socketed: LPCAMM supports up to 9.6 GT/s, considerably faster than what ships with the latest Macs.

This is why user-replaceable RAM and discrete GPUs are going to die out. The overhead and latency of copying all that data back and forth over the relatively slow PCIe bus is just not worth it.

The only way discrete GPUs can possibly be outcompeted is if DDR starts competing with GDDR and/or HBM in terms of bandwidth, and there's zero indication of that ever happening. Apple needs to put a whole 128GB of LPDDR in their system to be comparable (in bandwidth) to literally 10-year-old dedicated GPUs: the 780 Ti had over 300GB/s of memory bandwidth with a measly 3GB of capacity. DDR is simply not a good choice for GPUs.
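To put rough numbers on that, here's a quick sketch of peak theoretical bandwidth. The 780 Ti figures match its published specs; the 128-bit LPCAMM bus width is my assumption:

```python
# Peak theoretical bandwidth = transfer rate (GT/s) * bus width (bits) / 8.
def bandwidth_gbs(transfer_gts: float, bus_width_bits: int) -> float:
    return transfer_gts * bus_width_bits / 8

# GTX 780 Ti: 7 GT/s GDDR5 on a 384-bit bus.
print(bandwidth_gbs(7.0, 384))   # 336.0 GB/s

# LPCAMM at 9.6 GT/s, assuming a 128-bit module.
print(bandwidth_gbs(9.6, 128))   # 153.6 GB/s
```

Even at its fastest, the socketed-DDR option lands at less than half the bandwidth of a decade-old dedicated card.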

[–] [email protected] 3 points 5 months ago

That's kinda true, in the sense that all batteries use a chemical reaction to generate electricity and a damaged battery can short and thus ignite arbitrarily. But there are lithium-based chemistries like LiFePO₄ that burn significantly less intensely, if at all, and there are lab-only chemistries that are non-flammable. So it's not really the lithium specifically that makes them burn so well.

[–] [email protected] 7 points 6 months ago

if you're boiling water, it can be any arbitrary temperature above 100.

That's not how boiling works. Water heats up to its boiling point, where it stops and boils. While boiling, the temperature does not increase; it stays exactly at the boiling point. This is called latent heat: at its boiling point, water absorbs heat without increasing in temperature until it has absorbed enough for its phase to change.

There is an exception to this called superheating.
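As a rough illustration of how much heat that phase change soaks up, using the standard textbook value for water's latent heat of vaporization (≈2257 kJ/kg):

```python
# Energy needed to boil away water that is already at 100 °C:
# Q = m * L, with L ≈ 2257 kJ/kg for water at atmospheric pressure.
def heat_to_vaporize_kj(mass_kg: float, latent_kj_per_kg: float = 2257.0) -> float:
    return mass_kg * latent_kj_per_kg

print(heat_to_vaporize_kj(1.0))  # ~2257 kJ to turn 1 kg of boiling water into steam
```

That's several times more energy than it took to heat the water from room temperature to 100 °C in the first place, all absorbed with zero temperature change.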

[–] [email protected] 10 points 6 months ago

100F is a fever; if you're experiencing those regularly you should go see a doctor.

[–] [email protected] 2 points 6 months ago (1 children)

Distributed ledger data is typically spread across multiple nodes (computational devices) on a P2P network, where each replicates and saves an identical copy of the ledger data and updates itself independently of other nodes. The primary advantage of this distributed processing pattern is the lack of a central authority, which would constitute a single point of failure. When a ledger update transaction is broadcast to the P2P network, each distributed node processes a new update transaction independently, and then collectively all working nodes use a consensus algorithm to determine the correct copy of the updated ledger. Once a consensus has been determined, all the other nodes update themselves with the latest, correct copy of the updated ledger.

From your first link. This does not describe how git functions. Did you actually read the page?

The consensus problem requires agreement among a number of processes (or agents) for a single data value. Some of the processes (agents) may fail or be unreliable in other ways, so consensus protocols must be fault tolerant or resilient. The processes must somehow put forth their candidate values, communicate with one another, and agree on a single consensus value.

From your second link. Again, this description does not match git.

You're right in that automation is not technically required; you could build a blockchain on top of git by having people perform the distribution and consensus algorithms themselves. Obviously that doesn't make git itself a blockchain, in the same way it doesn't make IP a blockchain.
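A minimal sketch of the distinction, under the toy assumption that consensus is a simple majority vote over candidate chain heads (real consensus protocols are far more involved):

```python
import hashlib
from collections import Counter

# Hash-linking alone: both git commits and blockchain blocks do this.
def block_hash(parent_hash: str, payload: str) -> str:
    return hashlib.sha256((parent_hash + payload).encode()).hexdigest()

# The part git does NOT do on its own: independent nodes comparing their
# candidate chain heads and converging on one without a central authority.
def toy_consensus(candidate_heads: list[str]) -> str:
    head, _count = Counter(candidate_heads).most_common(1)[0]
    return head

genesis = block_hash("", "genesis")
fork = block_hash(genesis, "fork")
print(toy_consensus([genesis, genesis, fork]))  # majority wins: genesis
```

Git gives you the hash-linked history for free; the distributed-ledger part only happens if people run something like `toy_consensus` by hand.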

[–] [email protected] 1 points 6 months ago (3 children)

Key words: distributed ledger. Git repositories don't talk to each other except when told to do so by users.

I shouldn't need to explain why an access key is not a consensus algorithm. Seriously?

[–] [email protected] 8 points 6 months ago (1 children)

Well, I'm saying Circulor is most likely lying about their "blockchain" actually being a blockchain, or that they've pointlessly set up extra nodes to perform redundant work in order to avoid technically lying.

Blockchain is completely pointless without 3rd parties being part of the network. It's like me saying I run a personal social network for just myself.

[–] [email protected] 0 points 6 months ago (5 children)

Git is not a blockchain. There is no distributed ledger; no consensus algorithm.

[–] [email protected] 11 points 6 months ago (5 children)

Polestar uses contracts and audits to ethically source materials, not blockchain. It uses blockchain as a shitty append-only SQL database to (apparently) tell you where the materials came from. Let me quote from Circulor's website:

data can be fed seamlessly to the blockchain via system integration using RESTful Web Service APIs with security and authentication protocols

So the chain is private and accessible only through a centralized, authenticated REST API. This is a traditional web application. A centralized append-only ledger is not even a blockchain.
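To illustrate the point: a single operator can get the same guarantees from a plain database table behind an authenticated API, no blockchain required. A minimal sketch (the table name and schema here are made up for illustration, not Circulor's actual setup):

```python
import sqlite3

# A plain append-only ledger: a regular SQL table you only ever INSERT into.
# This is functionally what a private, single-operator "blockchain" provides.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ledger (
        id       INTEGER PRIMARY KEY AUTOINCREMENT,
        ts       TEXT DEFAULT CURRENT_TIMESTAMP,
        material TEXT NOT NULL,
        origin   TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO ledger (material, origin) VALUES (?, ?)",
             ("cobalt", "Mine A"))
for row in conn.execute("SELECT * FROM ledger ORDER BY id"):
    print(row)
```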
