Sometimes I forget …

This isn’t “normal” for most other people.
Today marks the first time I’ve updated the BIOS in my laptop since I bought it. When I initially powered it up, Windows 11 was “installed” (more like present; it took almost 2 hours to fully update itself before it was usable), so I used that to update the BIOS and various firmware the first time around. Shortly after that process was complete, I formatted the drive and installed Arch Linux. Goodbye, Windoze!
The only problem with running Linux on a piece of hardware from Dell is that sometimes they don’t support anything other than crummy Windows BS.
This would typically leave users in a predicament when needing to do things like update their firmware or BIOS to the latest versions. Luckily, there’s a handy platform called the Linux Vendor Firmware Service (LVFS) for exactly this! Most of the hardware in the machine supports being updated via this platform, but sadly, my system itself is not listed on the Device List. So while I can update the firmware for my trackpad, fingerprint reader, USB hubs, and many other devices, I cannot update the BIOS natively within Linux. This was disheartening, as I am not about to reinstall Windoze or make a DOS boot disk just to flash a BIOS update. While poking around on their support website, I came across this little gem:
All Dell systems from 2015 and later support flashing an updated BIOS from within the boot menu. Note: the BIOS flash ends in an .exe extension. Even though Linux cannot open it natively, the BIOS will deal with it properly.
Huzzah! Apparently, to update the BIOS, all you have to do is download the .exe file and stuff it onto a FAT32-formatted USB drive. Boot the computer while connected to power, mash F12 to get the One Time Boot menu, and leverage the handy option to update the BIOS. I can confirm this worked perfectly! Dell’s support site has detailed instructions.
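For the curious, those steps can be sketched as a tiny shell function. Everything here is an assumption on my part – the function name, the mount point, and especially the device path – so check lsblk for your actual USB stick before pointing this at real hardware:

```shell
# Hand-rolled sketch of prepping the BIOS-update stick. The device path,
# mount point, and function name are all placeholders (assumptions).
prep_bios_usb() {
    device="$1"       # e.g. /dev/sdb1 -- the USB stick's first partition
    bios_exe="$2"     # the BIOS .exe downloaded from Dell's support site

    if [ -z "$device" ] || [ -z "$bios_exe" ]; then
        echo "usage: prep_bios_usb /dev/sdX1 bios-update.exe"
        return 1
    fi

    mkfs.vfat -F 32 "$device"        # the boot menu expects FAT32
    mkdir -p /mnt/biosusb
    mount "$device" /mnt/biosusb
    cp "$bios_exe" /mnt/biosusb/     # the flasher picks the .exe off the root
    umount /mnt/biosusb
    echo "Done. Reboot on AC power, mash F12, and pick the BIOS update entry."
}
```

Then something like prep_bios_usb /dev/sdb1 XPS_BIOS.exe (with your real device and file names) readies the stick.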
I wrote a blog post a short while back about why I loved Linux. I waxed nostalgic for a while about my history with the OS and how it impacted my life. In the same post, I talked about my then-current setup utilizing a MacBook Pro, but I managed to get a bit off track toward the end, ranting about Apple’s ever-ongoing dumbing down of Mac OS, surveillance of user activities, stranglehold on developers – as well as the general lack of open-firmware alternatives for mobile hardware. When I wrote that post, I was certain I wouldn’t compromise or accept a replacement for my current machine until all of the conditions I was looking for were met…
I compromised. Sadly, there is no perfect device yet. There are, however, some REALLY AWESOME devices that get pretty damn close to the pin. I consider last year’s Dell XPS 15 9530 to be one of them. Sure, it’s got some closed-source firmware and wacky audio amplifier issues, but it’s a beast with kick-ass battery life, a beautiful screen, and great build quality.
I had no intention of buying a new notebook this year, but it just sort of happened. While trying to focus and work through some AVD nonsense for a customer, my LG monitor began to flicker slightly. I tried to ignore it, but after a couple of hours it was becoming more bothersome and my head and eyes began to hurt. After several attempts at resetting, unplugging and replugging, checking firmware and cables, and so on, I finally gave up. There was no way I would be able to finish the work I needed to like this, so I accepted defeat and drove over to Microcenter to buy a new monitor. I should have known the danger waiting for me; every time I’ve walked into that store in the past, I’ve come home with something… additional. This trip was no different.
While ogling the new monitors and trying to decide between OLED or IPS, I noticed I happened to be standing right next to the notebook section… of course my eyes wandered. You know you’d look, too. After I settled on an open box Dell U4025QW, I turned my attention to the laptops. I was immediately drawn to the XPS 15. I’ve had an XPS 15 before, but that was years ago when Microsoft still sold them at their retail stores (RIP). The new model has a gorgeous OLED display driven by an RTX 4070, 32 gigs of RAM, and a 20-core i9 processor… it didn’t take long for me to convince myself, or the sales guy, that I needed it.
I still had work to do once I got home. I put the new laptop in the kitchen so it was out of sight and out of mind (hey, I tried) while I wrapped up my tasks for the day. The second I was finished with work, though, I dove into the deep end setting up my new machine.
I pressed the power button. The OLED screen displayed a blazing white Dell logo in the center of a perfectly dark black expanse… I drooled. Seconds later I was greeted by the Windows 11 setup screen. It had been a long time since I’d set up Windows. Wow, it’s total garbage these days. It took forever to get through OOBE. Updates, reboots, more updates, ads and offers, ugly start layouts, more ads… What a mess. Had I not wanted to ensure it had the latest firmware updates from Dell, I would have skipped the Windoze shenanigans entirely. Sadly, most people seem to run Windows, so Dell provides the best support for updating using that platform. Once I confirmed my system was fully patched and ready, the real fun began.
I grabbed my favorite thumb drive and dropped the latest Arch Linux ISO onto it. I just wanted to test Linux on this hardware, so I wasn’t particularly thorough in choosing secure options during the install (encryption could wait until I knew the thing would even boot). I’m still in shock. Not even a full ten minutes later, I was sitting at the KDE Plasma desktop. Amazingly, everything appeared to “just work” right out of the box! I’ve never had a machine this easy to install Linux on. That’s when things got a bit more serious: I realized Microcenter had a 14-day return policy for this machine. I had 14 days to decide whether I was keeping this thing, and I was going to make it – and Linux – my daily driver for the next two weeks. I would give it everything I had to finally ditch Mac and Apple’s ever-evolving circle-jerk of bullshit.
That was 30+ days ago. I’m sitting here typing this post from the XPS. I sold the Mac to a buddy and ended up back at Microcenter to buy 64 gigabytes of RAM and a 4TB Samsung 990 Pro NVMe SSD to trick this machine out. I’ve reloaded Arch a couple of times to rectify some of the poor initial decisions (hello, LUKS), but have finally settled on a stable, maintainable install that I am satisfied with. Without Microsoft, Apple, or Google spyware.
Continuing our journey to install and configure DoD Certificates on various platforms – this time I submit for your reading pleasure: Arch Linux!
This one is relatively easy. Like other platforms, trusted roots can live in a few places on Arch (Personal NSS or System stores, as well as some various Apps that have their own keystores). More info on that here. I like to dump these certs directly into the System’s trust anchors because I’m lazy and I don’t want to fuss with anything else. To do that, simply run:
./add-dod-certs.sh /etc/ca-certificates/trust-source/anchors "sudo update-ca-trust"
If you blindly ran that code block you might now be wondering where to get the add-dod-certs.sh script from. Have a look here: https://gist.github.com/AfroThundr3007730/ba99753dda66fc4abaf30fb5c0e5d012#file-add-dod-certs-sh
As always, review any script you get from someone else! At the time I reviewed it, there was no monkey business. 😉 Good luck!
Trying to use your CAC on a Mac? Don’t want to run some sketchy compiled app to install DoD Certs on your box? Check this handy scripty-doo out. It grabs the latest PKI zip, unpacks it, converts the certificates into a format that works and then installs them into the system’s trust store. Be prepared to either type your password a zillion times or use TouchID to modify the trust store – thanks, Apple!
#!/bin/bash
set -eu -o pipefail

export CERT_URL='https://dl.dod.cyber.mil/wp-content/uploads/pki-pke/zip/unclass-certificates_pkcs7_DoD.zip'

# Download & extract the DoD root certificates
cd ~/Downloads/
/usr/bin/curl -LOJ "${CERT_URL}"
/usr/bin/unzip -o "$(basename "${CERT_URL}")"
cd "$(/usr/bin/zipinfo -1 "$(basename "${CERT_URL}")" | /usr/bin/awk -F/ '{ print $1 }' | head -1)"

# Convert each .p7b bundle to PEM, split it into individual certs, and import
for item in *.p7b; do
    TOPDIR=$(pwd)
    TMPDIR=$(mktemp -d "/tmp/$(basename "${item}" .p7b).XXXXXX") || exit 1
    PEMNAME=$(basename "${item}" .p7b)
    openssl pkcs7 -print_certs -in "${item}" -inform der -out "${TMPDIR}/${PEMNAME}"
    cd "${TMPDIR}"
    /usr/bin/split -p '^$' "${PEMNAME}"    # split on the blank line between certs
    rm "$(ls x* | tail -1)"                # drop the trailing empty chunk
    for cert in x??; do
        sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain "${cert}"
    done
    cd "${TOPDIR}"
    rm -rf "${TMPDIR}"
done
Since joining Microsoft I’ve worked remotely for nearly 9 years. My current setup consists of a MacBook Pro, a Thunderbolt 4 Dock, and an LG 34WK95U-W (which is an UltraWide 5K2K Nano IPS LED Monitor). I run a Windows 11 ARM VM via Parallels full-screen on the MacBook (basically for Teams and Outlook, so I don’t have to allow the Death Star to manage my physical device via enrollment), and then run other useful apps and workflow on the external monitor.
With the introduction of Azure Virtual Desktop (AVD hereafter), I’ve now come to rely on several systems to get my work done – especially being so tightly integrated with the customer that I access their systems through AVD. I use Teams within their organization as much as, if not more than, I do within Microsoft. 😅
One of the things I love about AVD is App Streaming. Rather than having to connect to yet another VM Desktop, individual Applications can be published to me and appear almost as if they’re native on my device (despite having FUGLY Windoze styling) by streaming the App window itself. This works GREAT for Teams!
This has been an amazing boost to my workflow, but there’s been a crappy behavior that’s been nagging at me. Every time I click into one of the AVD Remote Apps, the built-in screen on the Mac switches away from the space containing the Windoze VM back to the MacOS desktop, no matter which Screen or Space (or Monitor) the Remote App is on.
I was certain Mission Control and Spaces had something to do with this, but nothing I changed in the relevant settings corrected the behavior.
Luckily, a Kagi search turned up a useful thread regarding similar complaints from nearly TEN years ago! https://discussions.apple.com/thread/4995042?sortBy=best This solution was originally for MacOS Leopard! Thankfully, it works even on Sonoma 14.5!
defaults write com.apple.dock workspaces-auto-swoosh -bool NO
killall Dock
With this change, things operate a bit differently. Before, rapid app switching (like Alt+Tab on Windows, Cmd+Tab on Mac) would automatically bring the selected app to the forefront on an available Space; now it no longer does. I have to manually flip between Spaces, but this is MUCH better than the unexpected and wacky automatic jank from before!
Hopefully this helps someone else! 🙂
So you thought you were going to be slick and save some time by using the Marketplace template to deploy “WordPress on App Service” in Azure Government? You likely even selected the ‘Recommended’ tick-box to offload media content to Azure Blob Storage via the template add-on section and then hit “Review and Create!” But when you finally logged in and created your first post with some media, you noticed an error message that content couldn’t be uploaded to Azure Blob Storage using the W3-Total-Cache plugin for WordPress… You poked your way through the WordPress configuration to the plugin settings and found this section:
Yours DOES NOT look like mine above. 😉 (I forgot to get a BEFORE screenshot. Woops.) Despite the “or CNAME” box having the correct “usgovcloudapi.net” base DNS name, when you click Test Microsoft Azure Storage upload, you receive something to the effect of: Unable to resolve DNS name <storageaccountname>.blob.core.windows.net.
The problem? The W3-Total-Cache plugin uses the deprecated Azure Storage PHP client libraries, which appear to have a hard-coded service URL that only works in Azure Commercial. If you modify line 54 of the Resources.php file, changing blob.core.windows.net to blob.core.usgovcloudapi.net, you’ll have a much easier time! You’re welcome! 🙂
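If you’d rather script the patch than hand-edit the file, a one-line sed does the swap. The function name here is mine, and you’ll need to locate the bundled Resources.php yourself (grep the plugin directory for the hostname):

```shell
# Replace the hard-coded Commercial Blob endpoint with the Azure Government
# one. Pass the path to the plugin's Resources.php; find it with something
# like: grep -rl 'blob.core.windows.net' wp-content/plugins/w3-total-cache/
fix_blob_endpoint() {
    sed -i 's/blob\.core\.windows\.net/blob.core.usgovcloudapi.net/g' "$1"
}
```

(That’s GNU sed; on macOS/BSD you’d need sed -i '' instead.)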
I hate companies that make users think they’re getting something for free but then have these hidden functions that report back to the mother-ship what users are doing with them. WordPress’s parent company, Automattic, doesn’t have the best reputation as of late (check out the deal they made to sell your data to OpenAI). Just now I was prompted to “Verify your admin email address is still correct” while logging in. That seems innocent enough – and even mildly HELPFUL… but if you knew what Automattic was REEEAAAALLY doing, you would think differently. When you click that cute little blue button to “confirm” your information is correct it’s not just saving your email address locally (you DO host WordPress yourself, right?).
119 6 veth119i0-OUT 11/Apr/2024:18:06:40 -0400 policy DROP: IN=fwbr119i0 OUT=fwbr119i0 PHYSIN=veth119i0 PHYSOUT=fwln119i0 MAC=bc:24:11:60:5d:f6:22:e5:10:fa:b2:af:08:00 SRC=172.16.13.32 DST=198.143.164.251 LEN=60 TOS=0x00 PREC=0x00 TTL=64 ID=61509 DF PROTO=TCP SPT=46840 DPT=443 SEQ=1268854965 ACK=0 WINDOW=62720 SYN
Above we see the web server being denied a TCP connection to an HTTPS endpoint (thank you, outbound firewall). Let’s reverse lookup that IP to see who it belongs to…
WordPress, you sneaky little shits! What a clever way to hoover up user data! We can only assume they were going to guzzle that email address and who knows what else.
In order to keep your data safe from these jackasses, I’d suggest hosting WordPress in a manner where you can control both INBOUND and OUTBOUND traffic flows. I do this with the built-in firewall of a piece of software called Proxmox – an open source virtualization platform.
When I spin up a Linux container, I like to set some VERY strict rules for how it can access the network. In my case, both Input and Output policy for the firewall are set to DROP. This allows me to be very granular when permitting only the traffic necessary.
For this container I allow only outbound access to perform DNS lookups and to my local proxy service. Inbound HTTP and HTTPS are only permitted from the Proxy. Anything else not explicitly defined hits those default drop rules. When it’s time to perform software updates, I stop the web server in the container before temporarily toggling the Output Policy to allow traffic so that WordPress doesn’t sneakily send anything while my guard is down.
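For reference, the container’s ruleset ends up looking roughly like the sketch below. This is my hand-written approximation of a /etc/pve/firewall/<vmid>.fw file – the proxy and resolver addresses are made-up placeholders, not my real network:

```
[OPTIONS]
enable: 1
policy_in: DROP
policy_out: DROP

[RULES]
# DNS lookups only, to the local resolver (placeholder address)
OUT ACCEPT -dest 172.16.13.1 -p udp -dport 53
# Outbound only to the local proxy service (placeholder address)
OUT ACCEPT -dest 172.16.13.10 -p tcp -dport 443
# Inbound HTTP/HTTPS permitted only from the proxy
IN ACCEPT -source 172.16.13.10 -p tcp -dport 80
IN ACCEPT -source 172.16.13.10 -p tcp -dport 443
```

Anything that doesn’t match a rule falls through to those DROP policies – which is exactly how the sneaky outbound call in the log above got caught.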
I hope this has been enlightening!
Because Windows and MacOS both suck.
I’ve dabbled with Linux since I was a kid. I remember asking my uncle to buy a copy of Red Hat Linux (I think it was Red Hat 5?) on CD from eBay for me when I was just a middle-school-aged nerd. We had painfully slow dial-up internet at the time, downloading it would have taken millennia, and I wasn’t old enough to have the means to buy things on the net by myself.
Thinking back, Red Hat really started it all for me – and got me into a lot of trouble, too. I accidentally wiped the hard drive in our family computer – a fancy Gateway 2000 machine my stepmom’s parents gifted us – while trying to re-partition the drive to dual boot it. I remember my father’s face and that vein on the side of his forehead throbbing like an alien was about to burst out of him. He said something along the lines of “if you can put it back, I won’t beat your ass.” It didn’t take long before I had Linux dual booting with Windows – and escaped, ass unbeaten! The best part of this outcome was that no one else in our house had a clue what Linux was, it was like I had my own computer whenever I rebooted into it!
My first hurdle was figuring out how to interact with hardware that didn’t work natively in Linux. That first piece of hardware was the modem! Our machine had a WinModem, which worked fine in Windows but appeared to be almost non-existent in Linux. I remember when I finally figured out how to initialize it and get it to dial out! I had to issue dial strings directly: ATZ to initialize the hardware, ATM0L0 to set the volume to ‘0’ so my parents wouldn’t hear the screeching at 3am when I’d sneak downstairs to tinker, and ATDT <number> to dial. This was also when I stumbled into ‘hacking’ for the first time. When the modem finally connected to our ISP, instead of dropping into PPP and exchanging data automatically like in Windows, the connection stayed in SLIP mode. I remember seeing “Ascend MAX Advanced Pipeline Terminal Server” on my screen, being frustrated that I was “connected” but couldn’t exchange data, not understanding what had just fallen into my lap.
I remember poking around sheepishly the first few times I dialed in, seeing what commands were available, and then being amused that I could list other user sessions. One day it occurred to me to search for my friend who lived across town… who was connected… until I disconnected him! I messaged him on AOL IM “Check this out” before disconnecting him several times until he realized I was the one doing it and biked over to our house to see how.
After some harmless fun messing with friends who used the same ISP, I created several accounts in the system that I’d end up using from then all the way through my college years, whenever I needed “emergency” connectivity for “reasons.” Those accounts, though, eventually got my father’s computer confiscated and left me barred from being alone, unsupervised, in a room with a computer at school for two years.
In my college years I mostly played with Mandrake and Slackware distributions (my friend Cassie’s computer even saw quite a few iterations of those). Several of my floor-mates were also CSE majors, and friends sometimes jokingly referred to us as “The Linux Fascists.” To this day I’m still not sure what they meant by that. In the time after college I settled down and stuck with Debian for many years. Linux, then, was mostly for utility. I ran it on a NAS, a firewall, and on a couple of old machines that were only ever powered on when I was bored and wanted to tinker. My main machine was a PowerBook “Pismo” I had rescued while dumpster diving. It had a dead LCD panel, but worked fine when plugged into an external monitor. Replacing the panel was a piece of cake! MacOS X was AMAZING back then – sure, it was slow at first in the early days, but it was still way better than the dumpster-fire its descendant has become.
Linux also got me my first IT job. At age 19, while working for a “media production company” as a receptionist – the owner learned of my computer science background and offered me double time to come set up the office and storefront of a new business they were opening. I drove across the city on a Saturday morning to collect on that sweet, sweet payche… opportunity. While we were there working on getting the registers and network set up, the media company’s Ad server died (it was running a huge campaign for HBO at the time – basically paying everyone’s paycheck) and he looked at me and said “You said you know Linux, right? Can you go help get that back online? Without it we’re hosed.” Needless to say, I fixed the Ad server. Two weeks later, I took over as the lead of network infrastructure.
These days I spend a lot of time mucking about with other platforms and abstractions. Most Azure customers I work with run a lot of Windows workloads (or just web / cloud / data stuff). Despite working for Microsoft, my only real interaction with Windows is on my work computer. I absolutely hate it. The sheer amount of telemetry data Microsoft collects is atrocious. Users should revolt!
My desktop is a custom built gaming PC that used to run Windows before I got fed up and switched it over to Arch. (yes, yes, I run Arch Linux… don’t be toxic about it) For my mobile compute needs I’m still rocking a MacBook Pro, though likely not for much longer. I’ve got a previous-gen 2023 14″ M2 Max with 64 gigs of RAM… that I honestly thought would last me forever. The problem? You guessed it – the dumpster-fire known as OS X. I refuse to upgrade beyond Ventura (it took nearly a year for me to get here from Monterey!) because of the continued dumbing-down of the platform and intrusive telemetry and data collection (thankfully LittleSnitch can easily block all of this crap). I’ve contemplated loading something like Asahi, but it’s still early days and I’m not able to live without Thunderbolt and external displays on my notebook.
I’m still looking for alternatives. I want a mobile machine with equally sexy hardware and form factor, but with open firmware and hardware that works with Linux. And it needs to be reliable and from a reputable company. Sadly, that seems to be a tall order. Sure, lots of modern, sexy hardware can run Linux, (Hellooooo X1 Carbon) but closed source firmware is a HUGE turn-off for me. Framework laptop is fugly. Malibal is a SKETCHY company (see their review controversy). System76 … ehhhh I just don’t care for. What’s that leave? A StarBook? With a 1920×1080 display? My phone has more pixels!
Woops. Somehow I managed to get a little off track onto a mobile hardware rant, but the point is – Linux is where I started, it paved the way to where I am, and while I may have taken a brief excursion off the path to see what all the other platforms were about, it’s where I choose to be.
Pfffffffft! Dafuq? No thank you, Microsoft. That’s getting #disabled right TF now. (See Microsoft official documentation)
az config set core.collect_telemetry=no