Granted, its version number stopped being meaningful to most people after Mozilla adopted Chrome’s numbering scheme, but it’s still a milestone worth celebrating. Yay, Firefox! But it’s also hard to avoid the headwinds the browser faces, with dwindling market share compounded by mismanagement and a seeming lack of focus and direction.
Firefox is the last holdout from WebKit/Blink/etc, and its vanguard role in ensuring an open and healthy web cannot be overstated. Anyone who relishes its diminished position, downplays current issues, or ignores it entirely has a short attention span and an unwillingness to learn from history.
I’ve been a Firefox user (barring a few adventures with Mozilla’s Camino and SeaMonkey browsers) since the original Phoenix days in 2004. Those were dark days of the web, and Firefox was a breath of fresh air. The Mozilla Suite had all these other tools inherited from Netscape, but Firefox was simple and fast. People flocked to the browser for its tabbed interface, popup blockers, plugin support, and because it was something new after years of IE stagnation.
If you think about it, Chrome won for similar reasons. It was a complicated piece of software, but was sold with a simple interface and promises to include better integrations with tools people care about. Anyone who says pretty isn’t a feature doesn’t understand human psychology.
Its detractors today claim Firefox stands no chance of usurping, or even challenging, Chrome’s functional monopoly in any meaningful way. It’s funny: they said the same thing the first time, and were proven wrong. It might be able to do it again.
TNL is a new political party in Australia that seeks to attract disenfranchised LNP voters (the incumbent conservative coalition). I hope they’re successful; I’m not a voter of those parties, but they need to see that lurching away from moderate positions has consequences. But I digress!
I was checking out their policies page, and was intrigued by a linked whitepaper by Julian Webb called Digital Justice Strategy:
[D]igital justice policy is defined as a range of initiatives designed to advance equality of access to digital technology; to prioritise the digital participation of people who have been traditionally excluded from and marginalised by technology, and to provide citizens with individual rights and remedies appropriate to the digital age.
The paper proposes Nine Pillars of Digital Justice, the first three of which are preconditions, and the latter six are rights. I thought it was worth taking a look.
1. Access to digital services is a right, not a privilege
Yes. Everything is done online now. Those who don’t have Internet, either because it’s unavailable or too expensive, are at a steep disadvantage.
2. Key digital infrastructures should be owned and/or governed as public utilities.
I’m not sure. It’s true the Internet was successful owing to the fact no single corporation owned it, and that layers of it lend themselves to natural monopolies. Higher in the stack we have services so critical to the Internet’s function that it seems ridiculous to have them beholden to corporate mismanagement and Wall Street. Maybe we need to identify these and fund non-profits as well.
3. Digital justice requires an educated and risk-aware population. Education in digital and information literacy is a key, continuing, government responsibility.
Definitely. Ditto financial literacy, which is increasingly connected with this.
4. Data subjects own their own data
Yes we should, along with all the corollaries that come from this.
5. Everyone has autonomy in respect of their technological choices.
6. No person should be discriminated against as a consequence of their technological choices.
I agree with 5, and access should be a right (from 1), but there should be minimum required specifications for connected hardware, like the phone and electrical systems. My credit union shouldn’t be prosecuted for discriminating against my Commodore 16, which is physically incapable of interacting with their systems securely (inb4 serial cables to ssh jump boxes)! If this presents a financial burden for disadvantaged people, the Government should step in and subsidise or provide them hardware for free.
7. The use of automated decision-making by governmental and public bodies to determine the rights, entitlements or liabilities of any person should be transparent and subject to human oversight and review.
Absolutely; this is a bugbear of mine. Anyone against greater transparency should be viewed with suspicion.
8. Any government service proposing to use an artificial intelligence (AI) system to deliver or augment their services should ensure that it complies with current best practice in respect of ethical AI design.
This would be a good start, but I don’t even think “best practices” are sufficiently understood. We should reverse the responsibility, which 9 addresses:
9. The use of facial recognition systems or similar biometric-based technologies for law enforcement and related functions shall not be pursued by any government, public or statutory body unless and until a proper system of legal safeguards has been put in place that takes due regard of fundamental human rights.
Sounds good. Give people effective recourse, and I’ll bet a lot of the issues we currently see would mysteriously resolve themselves. It’s funny how that works!
I’m glad to see people thinking more about these issues. The current coalition have been actively malicious, and Labor have been such a mixed bag as to be hopeless. These are points we should be debating.
(This was a draft post written a couple of months ago, before the latest musky buyout offer entered the fray. I think the rambling point here still stands though, so I’m publishing it).
Tech journalists have been in a lather over Twitter’s moribund market performance and user growth of late. The conclusion among the talking heads is clear: Twitter needs new management to be a going concern.
This is interesting to read in the context of Netflix, the video streaming service that announced they lost users for the first time. But while people are up in arms and talking about losses to their catalogue, the hostile introduction of ads, and cancelling important shows, the same assumption made about Twitter above goes unacknowledged.
Why do businesses need to grow ad infinitum? It’s not just a moral question, it’s one of simple physics. Even if desirable, is it possible?
The tech world is still trapped in the mindset that growth is the primary, or only, metric with which to gauge success. A stable, profitable business with happy employees and a good corporate culture is shunned as stagnant in IT, where elsewhere it would be hailed as sustainable, or a blue chip.
Whether Netflix is having a blip, or it’s part of a larger worrying trend, remains to be seen. But if it’s the former, what’s the problem with a profitable company?
I suspect this is another symptom of the venture capital mindset that still dictates so much of IT. Some of us still haven’t learned the lessons of the first Dot Com Bubble, and that was two decades ago. Bummer, now I feel old in addition to confused.
The ERTMS is one of those technical things rail nerds and industry insiders hear about every few years, but it never seems to go anywhere. It’s been off the rails. It hasn’t got traction. It’s been station-ary. Engines.
Integrating Europe’s railways into one cohesive, interoperable network with common signalling, communication, electrical, and management systems sounds great; even inevitable. Europe has integrated in other areas, so it stands to reason transport would also benefit, both for passengers and freight.
ERTMS was a bold proposal. It encompasses everything from standardised driver accreditation, to measurements and security systems. It even asserts such commonality will reduce fleet prices thanks to economies of scale, and improve export possibilities.
The stakes also suddenly seem much higher. Effective intercity transit will be key to combatting climate change and oil dependency, including purchases from shitty regimes. As part of Ukraine’s European Union accession, I’d love to see their rail systems further integrated into this system. It will greatly assist with their post-invasion recovery. 🇺🇦
This mirrors the experience in Denmark, which followed Switzerland and Luxembourg in committing to a full rollout of ERTMS and to replace analogue relays with digital interlockings. Belgium, Norway and Luxembourg soon followed, and they have been joined more recently by the Netherlands, Italy, the Czech Republic and Sweden. Even Germany now expects to complete its rollout by 2040.
He quoted Matthias Ruete, Europe’s ERTMS coordinator:
Overall, we are pretty confident for the whole TEN-T network. There is only around 20,000km missing, which is why I think we were quite influential in convincing people that 2040 was the deadline to go for.
French rail systems have traditionally worried about adopting new systems owing to the price, though Kevin notes the EU has allocated funds to hasten its adoption.
The other question I have from a technical perspective is how such systems work. I see the word “digital” in places, and my head immediately drifts towards thinking what could go wrong, what could be compromised, and what standards are involved. There’s a lot riding on this being done right.
My late mum used to say you should start cooking dinner before you’re hungry, otherwise you make poor decisions about nutrition to satiate your immediate appetite. The same applies for grocery shopping on an empty stomach.
My dad used to say that Formula 1 drivers have to react to turns before they see them, or it’s already too late.
I only realised recently they were saying the same thing.
This is Suisei’s second appearance here on Music Monday, that ongoing series about music I always post on a Monday, except when I don’t.
Suisei’s performance of her 2021 hit during the Hololive Link Your Wish concert was everything I’d hoped it’d be. Her vocals at the tail end of the chorus are especially stunning, as are the music video graphics! ♡
This is a bit of a longer post explaining my adventures installing and testing an all-in-one (AIO) liquid cooler in my current build. If you want the executive summary: wow it’s good.
I won’t say it’s cool, I won’t say it’s cool, I won’t say… hey, Ruben, that AIO is pretty… it’s pretty… cool right? Get it, because it’s a… it’s a… cooler! Damn it.
I’ve been rebuilding my game machine and personal tower for a few months now, and the time came this month to review the CPU cooler. The Ryzen 5600X comes with AMD’s stock Wraith cooler, which is surprisingly decent at keeping the CPU below throttling temperatures. But it’s loud at full speed, and the CPU still runs hotter than I’d prefer.
My instinct was to buy another trusty Noctua heatpipe air cooler, such as the legendary NH-U12A I have in my FreeBSD server. But I’d never used an all-in-one (AIO) cooler before, and thought it’d be fun… if utterly overkill for this budget CPU with only six cores.
AIOs are a great introduction to liquid cooling. They contain the radiator, tubing, pumps, coldplate, and fluid all pre-installed in a closed loop. You attach the coldplate to the CPU, then route the tubes to the radiator which you affix to the case.
Liquid has greater thermal density than air, meaning a correctly-installed and properly spec’d AIO can transfer heat more efficiently than air alone. Their far greater thermal mass compared to air coolers also lets them “soak” bursty workloads for longer before the fans need to speed up.
AIOs tend to be referred to by the cumulative size of their fans. For example, a 120 mm AIO has a single 120 mm fan, a 240 has two 120 mm fans, and a 280 has two 140 mm fans. Some AIOs can get as large as 360 and 420 mm.
Why I chose the Arctic
There are dozens of different AIOs on the market, but I went with the Arctic Liquid Freezer II 280 for a few reasons:
The name is so cool. I know it’s probably water and ethylene glycol, but it sounds like I’m introducing cryogenic fluid into my machine. This may have played more of a role in my purchasing decision than I care to admit.
Online reviews have it consistently outperforming other AIOs of a similar size, both in cooling capacity and noise. Most of this is due to having the thickest radiator available, and Arctic’s superior fans. The different pump design might also play a part.
It has integrated cable management, and connects to the motherboard’s CPU fan header via a single cable. No need for fan controllers or messy wires! That said, Arctic still expose the fan headers on the device with cables, so you can replace or flip fan orientations.
It doesn’t have das blinkin lights, infinity mirrors, or other visual gimmicks that can’t be easily disabled. I didn’t want to pay extra for features I’ll want to immediately cover up with tape.
It doesn’t require third-party controller software; it takes the PWM signal from the motherboard and that’s it. My Asus BIOS lets me be granular with fan curves, so this works well.
It’s the largest AIO that feasibly fits (unmodified) into a Cooler Master NR200P Mini-ITX case. Just!
For my first AIO, it was surprisingly easy to install. The Arctic box comes with a QR code that redirects you to their instructions, along with a video showing you the process, though be warned the video is missing the section where the brackets connect to the coldplate.
Unsurprisingly, the biggest challenge was routing the stiff cables within the tight confines of the NR200P case, and without pushing against the fans. I ended up with the tubes towards the front of the case, so I could route them down and around the radiator towards the CPU. I still recommend getting a metal fan grill for the fan pointing at the CPU just in case.
One thing I did do was mount the radiator one screw higher on the side panel to give the GPU below a bit more clearance. I’m also not happy with the routing of the motherboard power cable to the power supply; I might get a longer one from CableMod that lets me route it away from the AIO fan to give it more clearance.
Provided you have fans at the top for exhaust, channels like OptimumTech recommend the counter-intuitive approach of blowing air through the radiator and into the case. This way, the fans are pulling in fresh air, not hot air from the GPU mounted below. It was easy enough to detach and flip the fans to pull air rather than push.
Here’s a top-down view of the case with the top fans and panel detached; note the sleeved tubing with the diagonal white stripes. Air flows fine, but the Arctic’s thicker radiator really is at the limit of what you’d want to put in this case! The smaller 240 mm version has the same radiator depth, so you’re not saving much space going with that model.
There’s nothing surprising here: this cooler is amazing. Plenty of people have already done detailed reviews, and my figures are by no means scientific, so take them with a mL of antifreeze. They’re from a day with a 22° ambient temperature:
Idle from boot dropped from ≈55° to ≈25°. I knew Ryzen booted warm, but that’s wild (to use the technical definition).
Average loads went from ≈75° to ≈50° in games like Minecraft and Train Simulator. Again, not especially taxing, but being under that magic 75° means the fans run quieter… and they’re already barely audible.
Peaks dropped from ≈90° to ≈70°. I’ve yet to stress this CPU hard enough to get it past this! This oversized AIO is easily able to keep up with the thermal loads of this lower-end 65W CPU.
I suspect those temperature deltas wouldn’t be as stark if you were coming from something better than a stock cooler, and if you were using something closer to a 5950X or a 12th-gen Intel. But I’d consider this a success :).
My fear of the case heating up was also unfounded, and I got the same results with or without the dust filter, which was encouraging. The AIO is able to wick away heat more efficiently and with much less noise than the stock cooler, and the top-mounted fans effortlessly circulate that warm air out of the case.
This experiment also validates Arctic’s fan design. I’ve been a loyal Noctua customer for more than a decade for personal and work projects, but these Arctic fans are whisper quiet and move a ton of air. I’d consider getting a mix of both manufacturers for future projects to play with different pressure profiles.
Building and tinkering with computers are fun! I’m glad I’m rediscovering all of this again.
I’ve used Vundle for years to manage my Vim plugins, but I saw vim-plug referenced in a plugin’s documentation, and thought it was worth a try. I don’t use many plugins, so the idea of a “minimalist” tool was appealing.
I added the following to my ~/.vimrc, with a larger and smaller plugin to test:
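As a rough sketch of what that looks like (the two plugins named here are placeholders of my choosing, not necessarily the ones the post tested with; substitute your own):

```vim
" Plugins are declared between plug#begin() and plug#end().
" vim-plug clones each one into the directory given to plug#begin().
call plug#begin('~/.vim/plugged')

" A larger plugin, as an example (placeholder choice)
Plug 'junegunn/fzf.vim'

" A smaller, single-file plugin (placeholder choice)
Plug 'tpope/vim-commentary'

call plug#end()
```

After reloading the vimrc, running `:PlugInstall` inside Vim fetches and installs everything in parallel, and `:PlugUpdate` keeps them current.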