Can confirm. Fitbit Charge 6 band did that to me until I replaced it with this style:
That bottom one looks embossed instead of printed. At the size of a USB-C cable plug, that’s going to be difficult to read outside of ideal lighting conditions.
The cable, not the package
Ultimately, it’s great that users won’t need to squint to read the fine print or cross-reference spec sheets once the labels gain popularity.
I can’t even read the labels on the cables in the article photos.
EDIT: I get it, you all have 20/10 vision and no astigmatism, thanks for your input.
CBD also allows for around a 600% improvement in IOPS over Bcache.
RIP
Elián González
Between the time Fox cancelled Family Guy and the time they brought it back, the network cancelled something like 20 other shows, Greg the Bunny among them.
I have no idea what you’re trying to say. It’s not a counter example. It is literally the example given in the article, which you quoted.
The awards are more heavily influenced by album sales than by subjective judgements of musical quality.
Do you know who Jon Batiste is?
The album won on quality. The sales spiked after the win.
every old Nissan I see falls apart
That says a lot more about the owners than it does about the vehicles.
And without the context that the Ars article provides, that information means very little to the casual visitor. There is absolutely nothing on that website to provide any of that context. It certainly doesn’t say that by uploading your photo, you are agreeing to allow Google an irrevocable licence to use it to train AI.
The only thing there is an image that says “Take control” which just links to the author’s cloud storage company. This whole thing is thinly-veiled viral marketing.
Sure, but that still feels very “You agreed!”. The only place on that website that tells you “beforehand” is hidden in the terms of service. That’s literally no different.
The site (TheySeeYourPhotos) returns what Google Vision is able to discern from photos. You can test with any image you want, or there are some sample images available.
…by submitting them to Google, who then keeps a copy of them and uses them for the exact same purpose which purportedly compelled the author to leave Google.
Better than using tabs as bookmarks
Let’s be realistic. How many devices support a mainline version of OpenWrt and have more than one 2.5 GbE port?
This thing is primarily a wifi router and access point. The available Ethernet ports, which are limited to what the chipset supports, are going to be more than sufficient for the majority of users.
If your main concern is wired throughput to the Internet, you are not the target audience for OpenWrt. The literal point of the OpenWrt project was to be an open source firmware for the WRT54G wireless router. The project has of course grown since then, but that is still its primary intended use case.
You are much more likely to find what you need in pfSense/OPNsense/etc, and on more powerful hardware. I would be way more concerned with the fact that it only has 1 GB RAM.
But if you still want to take that stance, there is nothing stopping you from reconfiguring the 2.5 GbE port as a VLAN trunk and hanging it off a managed switch. Put your uplink in one VLAN and your LAN in another. That is going to be more than sufficient to saturate the 1 Gbps fiber connection that most people have (or at least asymmetrically saturate the 2 Gbps connection that some people have).
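For anyone curious, here’s a rough sketch of what that trunk setup could look like in /etc/config/network on a DSA-based OpenWrt build (21.02 or newer). The port name eth1 and the VLAN numbers are placeholders; the actual interface name for the 2.5 GbE port will differ per device:

```
# /etc/config/network (sketch only; 'eth1' stands in for the 2.5 GbE port)

# Bridge the single physical port and enable VLAN filtering on it
config device
        option name 'br-trunk'
        option type 'bridge'
        option vlan_filtering '1'
        list ports 'eth1'

# Tag the uplink VLAN on the trunk port
config bridge-vlan
        option device 'br-trunk'
        option vlan '10'
        list ports 'eth1:t'

# Tag the LAN VLAN on the same port
config bridge-vlan
        option device 'br-trunk'
        option vlan '20'
        list ports 'eth1:t'

# WAN rides VLAN 10, LAN rides VLAN 20
config interface 'wan'
        option device 'br-trunk.10'
        option proto 'dhcp'

config interface 'lan'
        option device 'br-trunk.20'
        option proto 'static'
        option ipaddr '192.168.1.1'
        option netmask '255.255.255.0'
```

On the managed switch side you’d just tag VLANs 10 and 20 on the port facing the router, with the uplink and LAN ports as untagged members of their respective VLANs.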
Or if you don’t like that, just do the routing on the switch. If your primary concern is wired throughput, you’ll probably already be doing that anyway. Then just use this thing as an AP, in which case the one port is sufficient.
The included MT7976C wifi can theoretically saturate the 2.5 Gbps uplink on its own. The use case is overall throughput for a mixture of wired and wireless devices.
The US National Weather Service releases updated 84-hour forecasts every 6 hours. Even with supercomputers at their disposal, the computational complexity of simulating the physics means that is about the best they can do.
Google, meanwhile, is “developing a machine learning model that it says can accurately predict weather in seconds – not hours – and outperforms 90% of the targets used by the world’s best weather prediction systems.” Using a single desktop computer, they can generate a highly accurate 10-day forecast in under a minute.
More information:
https://www.weforum.org/stories/2023/12/ai-weather-forecasting-climate-crisis/
Given this information, and given the enshittification of Google search, would you still make the same guess about their profit margins?
Sounds very “You have been banned from /r/Pyongyang”