My name is Courtney Rosenthal. My pronouns are she/her/hers.
I am speaking on behalf of Open Austin. We are a local non-profit that addresses social and civic challenges through creative uses of technology.
I’m the project lead for our open data initiative. For over a decade, Open Austin has worked closely with the city on a number of issues, including the city’s open data effort.
Back in 2011, the city council adopted an open government resolution that called for the establishment of an open data portal. It was a bold new idea that government would aggregate and publish its data directly to the public. Open Austin worked closely with staff to encourage city departments to embrace this initiative and publish their data – as much as they can – on the portal.
In July of this year, the city completed an audit of the open data portal. It counted over 4,600 published assets – nearly a thousand more than any other city surveyed. Thanks to the work of many city departments, the original open data vision has been realized.
Unfortunately, the audit identified some concerns with the data being published.
Although we are publishing a large quantity of data, not all of it may be of value to the public.
Also, the audit identified issues relating to the quality of the data, such as missing data, dirty data, published data not matching internal data, and formatting and presentation issues.
The audit also found that Austin does not have a designated executive, such as a Chief Data Officer, responsible for the content in the data portal.
Open Austin supports two key actions to address these concerns. First, we support an update of the city’s 2013 open data policy to include objectives for the value and quality of data assets. Also, we support the appointment of a Chief Data Officer, to oversee this work.
Austin has successfully developed a leading open data portal. It is a significant resource to the community. Now, let us take the next step, to deliver increased transparency and value to the public.
Thank you.
In August 2021 (my previous year’s letter is here), Juliette and I moved out of our house to begin a remodel. The contractor originally thought he might have us back in November that year. Ha!
We were in our temporary house until March 2022. We moved home as soon as our house was minimally habitable. That meant a bedroom and fully functioning bathroom – and not much else. It took until November – 15 months and 2 weeks from the start – for them to complete the last task and close out the remodel. We’re glad to be back and we love what was done, but, jeesh, that was so freaking stressful and disruptive.
Here’s a photo album that shows the before and after of our remodel.
In May, as soon as the city completed its final house inspection, we were off to the Austin Animal Center to adopt a cat. A grey tabby (his shelter name was “Gray Cat”) crawled right into my lap and started licking my hand. Of course, we had to bring him home.
So, meet Franklin. He’s a bit under two years old. He’s a hoot. He plays fetch with me. I’ve got these foam balls the size of a ping-pong ball. When I throw one, he chases after it and brings it back to me. Until he gets bored, at which point he just stares at me and expects me to do the fetching.
Juliette and I continue to work from home. Around September, my company closed its Austin office and made us all virtual workers. I’ve been doing that for two years by this point, so it really didn’t change anything.
I continue serving as a commissioner on the Austin Library Commission. It’s been a rough year for libraries, with all these news stories about book bannings and staff abuse. I had the opportunity to foster some good news in Austin. I chaired a working group created by the commission to address these threats. Based on our recommendation, the Austin City Council adopted a “Freedom to Read” resolution to support our local libraries and oppose book banning.
Speaking of libraries, from July to October, I spent Saturdays visiting various branch libraries, volunteering to register voters.
To perform that service, I had to take training and become qualified as a Volunteer Deputy Registrar for Travis County. Yep, all just so I could help somebody fill out a voter registration form. Stuff like this – and the lack of online registration – are some of the ways Texas engages in voter suppression.
I did this because I was really frustrated about so many bad things happening in Texas – the gun violence, the failed electrical grid, the attacks on trans kids – and I was hoping more voters might help turn things around. Alas, all the incumbents were re-elected in the state.
My pandemic hobby was learning to make bread. I still keep up with it. I’m not quite at sourdough or baguette level, but some of them have come out nice. Here’s a recent loaf.
In last year’s letter I talked about my withdrawal from Facebook. This year, I’m dropping Twitter. Yeah, it’s the Elon Musk thing. (As I write this letter, Twitter is hanging on by its fingernails. Who knows what shape it’ll be in when you receive this.)
You may have heard of Mastodon, which is a community-driven alternative to commercial social media. I’ve become active there and I’m enjoying it very much. If you join, please connect with me @oh_that_courtney@hachyderm.io.
I’d love to hear from you sometime. My contact info remains:
512-573-5174 (mobile, text ok)
cr@crosenthal.com
Take care, and best wishes for a splendid 2023.
Sept 2 update: Council opposes book banning, adopts Freedom to Read resolution
You can see the agenda item here: https://www.austintexas.gov/department/city-council/2022/20220901-reg.htm#086
Across the country and in the state of Texas, libraries are confronting aggressive challenges to materials in their collections. School libraries have been the primary target of these incidents. While public libraries are a lesser target, these challenges are occurring there too.
A list of 850 books, released by Texas State Representative Matt Krause, has fueled much of the book challenge activity in Texas. Although public complaints often cite pornography as the justification for challenge, most of the challenges revolve around issues of race and racism, sex education, and LGBTQ topics.
It’s a core mission of the public library to provide quality information from a diversity of viewpoints, especially on these kinds of topics. Moreover, book bans and undue challenges are attacks on the open access and free inquiry that are essential to our democracy.
The freedom to read is a constitutional right, and we should act to protect it if such challenges occur at the Austin Public Library.
The Austin City Council can take a stand to support libraries and oppose book banning by voting to support “Freedom to Read.” I hope they will do that on Thursday.
photo credit: Jonathan Cutrer
I originally developed this website with the Jekyll static page generator running under Ruby 2.7, which is quite old. Yesterday, I decided it was time to upgrade to Ruby 3.0.
That’s great, but once I did, the jekyll command started failing. Yikes!
$ bundle exec jekyll serve
/home/courtney/Workspace/website-crosenthal-com/vendor/bundle/ruby/3.0.0/gems/kramdown-1.17.0/lib/kramdown/parser/html.rb:10: \
in `require': cannot load such file -- rexml/parsers/baseparser (LoadError)
There are known problems with Jekyll and Ruby 3.0. The Jekyll Quickstart directions tell you to install the webrick gem. That, unfortunately, was not sufficient to solve my problem.
I also needed the rexml gem. I ran:
$ bundle add rexml webrick
Fetching gem metadata from https://rubygems.org/...........
Resolving dependencies...
Fetching gem metadata from https://rubygems.org/...........
Resolving dependencies...
.
.
.
Installing webrick 1.7.0
Installing rexml 3.2.5
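For reference, bundle add records those dependencies in the project Gemfile. The resulting entries look roughly like this (a sketch – the exact version constraints on your system may differ):
# Added to the Gemfile by `bundle add rexml webrick`
gem "rexml"     # XML parser that kramdown needs; no longer loaded by default under Ruby 3.0 / Bundler
gem "webrick"   # web server used by `jekyll serve`; removed from the Ruby 3.0 standard library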
With that change, everything ran great:
$ bundle exec jekyll serve
Configuration file: /home/courtney/Workspace/website-crosenthal-com/_config.yml
Source: /home/courtney/Workspace/website-crosenthal-com
Destination: /home/courtney/Workspace/website-crosenthal-com/_site
Incremental build: disabled. Enable with --incremental
Generating...
Jekyll Feed: Generating feed for posts
AutoPages: Disabled/Not configured in site.config.
Pagination: Complete, processed 1 pagination page(s)
done in 1.039 seconds.
Auto-regeneration: enabled for '/home/courtney/Workspace/website-crosenthal-com'
Server address: http://0.0.0.0:4000/
LiveReload address: http://0.0.0.0:35729
Server running... press ctrl-c to stop.
I like Jekyll. It combines the power of a content management system (such as WordPress) with the simplicity, security, and performance of a statically hosted website. I write my posts in the simple Markdown format, and Jekyll processes them into HTML for publication.
Initially, I set up a fully automated CI/CD system for generating and publishing this website. It used a bespoke Docker image that ran Jekyll in a Jenkins automation server pipeline.
So when I made changes and pushed them into Git source control, the Jenkins pipeline would launch, rebuild the site content, and publish it to the live site. It was très fancy.
But it was quite complex and brittle. For instance, the Jenkinsfile had to copy the files in and out of a Docker volume so the container could access them.
It was way too much overhead and complexity for publishing my simple site. So I switched to simple, manual tooling.
I replaced the complex CI/CD pipeline with a simple Rakefile.
A Rakefile is a set of rules for the Rake build utility. Rake is closely associated with the Ruby programming language, and Jekyll is implemented in Ruby, so the two pair well.
Here is my entire Rakefile:
desc "Serve site locally"
task :serve do
sh "bundle exec jekyll serve --drafts"
end
desc "Rebuild site static content"
task :build do
sh "bundle exec jekyll build"
end
desc "Publish site static content to web server"
task :publish => :build do
sh "rsync -av --delete --checksum _site/. courtney@www.crosenthal.com:Website/htdocs/."
end
The two tasks I use directly are serve and publish.
The serve task launches a local web server at http://localhost:4000/, which contains the draft version of the full website. As the content is edited, the website served is automatically updated. (Restart is required only when the configuration changes.) That way I can edit the content locally and view the results immediately in a web browser.
Here is a sample run of the serve task:
$ rake serve
bundle exec jekyll serve --drafts
Configuration file: /home/courtney/Workspace/website-crosenthal-com/_config.yml
Source: /home/courtney/Workspace/website-crosenthal-com
Destination: /home/courtney/Workspace/website-crosenthal-com/_site
Incremental build: disabled. Enable with --incremental
Generating...
Jekyll Feed: Generating feed for posts
AutoPages: Disabled/Not configured in site.config.
Pagination: Complete, processed 1 pagination page(s)
done in 1.018 seconds.
Auto-regeneration: enabled for '/home/courtney/Workspace/website-crosenthal-com'
LiveReload address: http://0.0.0.0:35729
Server address: http://0.0.0.0:4000/
Server running... press ctrl-c to stop.
LiveReload: Browser connected
The publish task generates the full website content and then pushes it live to the web server. The publish task has a dependency on the build task, which does the bit about generating the website content. Then the publish task continues, using the Linux rsync command to efficiently push the changes to the live site.
Here is a sample run of the publish task:
/usr/bin/ruby /usr/local/bin/rake publish
bundle exec jekyll build
Configuration file: /home/courtney/Workspace/website-crosenthal-com/_config.yml
Source: /home/courtney/Workspace/website-crosenthal-com
Destination: /home/courtney/Workspace/website-crosenthal-com/_site
Incremental build: disabled. Enable with --incremental
Generating...
Jekyll Feed: Generating feed for posts
AutoPages: Disabled/Not configured in site.config.
Pagination: Complete, processed 1 pagination page(s)
done in 1.502 seconds.
Auto-regeneration: disabled. Use --watch to enable.
rsync -av --delete --checksum _site/. courtney@www.crosenthal.com:Website/htdocs/.
sending incremental file list
.
.
.
sent 5,212 bytes received 849 bytes 1,346.89 bytes/sec
total size is 9,145,652 speedup is 1,508.93
Process finished with exit code 0
I use an IDE to edit and manage my website: IntelliJ IDEA by JetBrains.
Even though IntelliJ is intended for Java development, it works fine for this purpose. It has a plugin that understands the Markdown format, which is how I author blog posts. Plus, IntelliJ knows how to run rake tasks, which allows me to integrate the Rakefile. It’s available in a free community edition, which supports all the features I need for my website.
I set up two run configurations, one to serve the website locally (called “rake serve”) and another to publish updates once changes are done (called “rake publish”); each runs the corresponding Rake task. The screenshot below (click to view full size) shows the setup of the “rake serve” run configuration.
The “rake serve” run configuration launches a local web server at http://localhost:4000/. It serves the draft version of the full website. I can browse the website as I edit. The website is automatically updated as I make changes.
To start it, I do: Run -> Run … -> Rake Serve
The screenshot below (click to view full size) shows the startup.
The configuration of the rake publish run configuration is the same as for rake serve (see previous screenshot) – just change both instances of “serve” to “publish”.
Once I’m done with my website changes and I’m ready to publish, I do: Run -> Run … -> Rake Publish
That’s it – all very manual and simple. I tried the fully automated way, but I like this better. The combination of a simple Rakefile and my IDE makes it easy to edit and publish my website.
It’s called wipeout. It’s posted here: https://github.com/courtney-rosenthal/wipeout
The tool has a few really nice features.
First, it’s hella fast. The conventional way to wipe a hard drive is to repeatedly write over it – and that can easily take half a day to complete. wipeout uses the ATA Secure Erase feature (supported by most modern drives) to perform the wipe in just a minute or two.
Second, it uses automatic device discovery. You don’t need to tell it device paths or anything. Just start wipeout. Then plug in the hard drive. It will find the drive and prompt you to confirm the wipe.
Finally, it certifies the wipe. When the wipe completes, it does a data readback to ensure everything is all zeroes.
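The core flow is straightforward to sketch. Here’s a rough Ruby illustration that shells out to hdparm – it assumes a hypothetical /dev/sdX, must run as root, and requires that the drive is not security-frozen. It is emphatically not the actual wipeout code; use the real tool.
# Illustration only: ATA Secure Erase plus a readback check (not the real wipeout).
DEVICE = "/dev/sdX"   # hypothetical device path
PASS   = "wipeout"    # temporary ATA security password; cleared by the erase

# Set a temporary security password, then issue the Secure Erase command.
system("hdparm", "--user-master", "u", "--security-set-pass", PASS, DEVICE) or abort("set-pass failed")
system("hdparm", "--user-master", "u", "--security-erase", PASS, DEVICE) or abort("erase failed")

# Certify the wipe: read the whole device back and confirm every byte is zero.
File.open(DEVICE, "rb") do |dev|
  while (chunk = dev.read(1024 * 1024))
    abort("non-zero data found!") unless chunk.bytes.all?(&:zero?)
  end
end
puts "Wipe certified: device reads back all zeroes."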
The tool is released into the public domain. Give it a try.
Disclaimer: use at your own risk!
photo credit: https://toolbox.easeus.com/hdd-wipe/destroy-old-hard-drives.html
Bad code comments are bad. Bad comments are worse than no comments.
Bad code comments generally fall into two types.
One type of bad code comment is a wrong comment. This kind of comment makes a statement that is factually incorrect. Like this:
# Calculate the circumference
result = pi * radius^2
(πr² is the area of a circle, not its circumference.)
Sometimes wrong comments are just due to a programmer misunderstanding or mistake. Frequently, they can result from changes to existing code that render the comment incorrect. (So, kids, when fixing or changing existing code, be sure to update the comments too.)
Another type of bad code comment is the unhelpful comment. Like a bad play-by-play announcer, this comment uselessly explains what you already plainly see.
# Calculate the result
result = pi * radius^2
These kinds of comments often are authored by inexperienced coders. They’ve heard that code comments are important (true!) so (by gum!) they’re going to write a comment.
Comments should be used for explaining things that are not clear from a simple reading of the code. If you can make the code clearer, maybe you don’t even need a comment. In the examples above, I’d consider changing the “result” variable to “area”, giving:
area = pi * radius^2
There, no code comment needed – unless it would be helpful to explain why the area is being calculated.
Sometimes no comment is the best comment.
The resolutions, as amended and adopted, call for city staff to create two reports on these two matters. Here are the resolutions, as finally approved:
The reports are due to be completed by September. There currently is no plan for public presentation; they would be communicated by memos to the City Council.
Given the public interest (and heightened media interest) in these matters, I believe they deserve public visibility. One way to get that is for a city commission to request a briefing from staff.
On the evening of June 8, I attended the meeting of the City of Austin Community Technology and Telecommunications Commission. The meeting had time allocated for “public communications”, where members of the public can raise issues for the commission.
I appeared at this meeting and made a statement, requesting a public presentation on these two reports.
Here is my statement:
Thank you commissioners. My name is Courtney Rosenthal. My pronouns are she/her/hers. So you know, I served on this commission for 12 years, up until 2014. I currently serve on the Library Commission, although my appearance tonight is unrelated to that.
I wish to let you know that there is an opportunity ahead that could benefit from your involvement. This relates to two reports being prepared by city staff, concerning cryptocurrencies and blockchain technology.
Back in March, the City Council adopted two resolutions that direct city staff to prepare reports on these matters.
There was a lot of public interest in these issues at that time. They generated significant media coverage, both local and national. One KXAN news story framed it as Austin considering whether it should become a “crypto city.”
Now that the reports are coming due, I think there should be an opportunity for the community to receive the information in them. Unfortunately, no public presentation is planned. They just would be memos to council. Given the interest and potential impact of these issues, I think there should be public visibility into the reports.
Therefore, I respectfully propose that you request a briefing from city staff on these reports and their findings.
I think this commission is especially well suited to receive this briefing. You are used to evaluating new technologies and how they might benefit our residents and our city.
Moreover, you are especially adept at doing so through an equity lens.
One reason I’m concerned is because should Austin choose to accept payments in the form of cryptocurrency, it will send a message to residents that it’s ok to go “all in” on crypto. As somebody who bought a small amount of crypto back in March and watched my portfolio shed 40% of its original value, I believe that would be harmful.
Another key point is that the blockchain resolution calls for exploring other technologies – unrelated to blockchain – that may serve the public good, such as public banking. I look forward to how the report addresses these opportunities.
A public presentation on these two reports would serve the community and would benefit from your consideration. That’s why I hope you’ll request a briefing from staff on these reports. Thank you.
At the upcoming March 24 Austin City Council meeting, the council is considering two very bad resolutions relating to cryptocurrencies and blockchain. Both proposals should be rejected.
It is not technophobic to be concerned about these proposals. Many technology experts are skeptical of blockchain and cryptocurrencies. The exuberant support for them tends to come from people who have a vested interest in crypto.
I oppose them and I’m totally not technophobic. I currently am a principal software engineer for a multi-national company in the broadband telecommunications sector. I helped found the Code for America Brigade in Austin. I am a past member and chair of the Austin Community Technology and Telecommunications Commission.
Tech is my thing. Yet, I think both resolutions are bad for Austin and should be rejected.
Here are three reasons why.
The City of Austin is committed to a strong response to climate change and the impending environmental catastrophe. Austin has instituted a strong Climate Equity Plan to address this.
Blockchain technologies, however, are an environmental disaster.
The US Energy Star program notes:
Buildings used to house cryptocurrency mining can create a massive strain on local electricity grids, with a single crypto transaction consuming more energy than that required to power 6 houses for a day in the U.S. The estimated global annual energy consumption of the current cryptocurrency market is over 68 TWh, equivalent to more than 19 coal fired power plants operating continuously. Due to the technical nature of blockchain, this number is projected to grow to 100 TWh annually.
[source]
The City of Austin cannot be committed both to a strong climate program and blockchain technology. They are mutually exclusive. Please support environmental concerns and reject these resolutions.
It would be irresponsible for the City of Austin to put our tax dollars and other city revenues into cryptocurrency holdings like Bitcoin. City funds should not be placed into speculation.
At the moment, cryptocurrencies are riding high on a speculative bubble. When (not if) that breaks, a lot of people will be hurt. If the city moves into cryptocurrencies, residents may take that as an endorsement of cryptocurrencies and a signal that they are safe. That would make a terrible situation even worse in the event of a cryptocurrency crash.
Sound financial instruments have institutional oversight to promote the stability of these assets – like the Federal Reserve and the Securities and Exchange Commission. Not only do cryptocurrencies lack any institutional support, they are founded on principles that reject such controls. Thus cryptocurrencies present substantial financial risk.
It would be irresponsible for the City of Austin to take a position in cryptocurrencies at this time.
We don’t have to look far for examples of these risks. Soon after recently elected New York Mayor Eric Adams chose to take his salary in cryptocurrency, there was a major drop, and he may have lost around $1,000. [source]
Please don’t expose Austin residents or city revenue to this risk. The city should neither use nor actively support cryptocurrencies.
The US National Institute of Standards and Technology (NIST) issued a recent Blockchain Technology Overview report.
It describes blockchains as follows:
Blockchains are tamper evident and tamper resistant digital ledgers implemented in a distributed fashion (i.e., without a central repository) and usually without a central authority (i.e., a bank, company, or government).
This raises the question of why a local government, one of the soundest and most trusted of institutions, needs a ledger that runs autonomously without a central authority.
The answer is: typically it doesn’t. Trusted record keeping can be done by the city or its designees.
Further, blockchain applications are limited. Blockchains typically store ledger entries, not content.
The biggest problem of storing data on a blockchain is the amount of data you can store. This is either because the amount is limited by the protocol or because of the huge transaction fees you would have to pay.
[source]
For instance, when you acquire a non-fungible token (NFT), the content (graphic, audio, whatever) still resides on the conventional web. The blockchain just holds a web link in the ledger. A trusted authority (like the City of Austin) doesn’t need blockchain to do this. It can just publish the ledger itself.
The NIST report recognizes that once the hype recedes, there could be suitable uses for blockchain. The city certainly should consider all sound technologies. But it can do that without these resolutions. These resolutions only serve to fuel the current hype cycle. They should be rejected.
Both resolutions currently before the council should be rejected. They are bad for the city, bad for residents, and won’t serve to foster civic innovation. Please vote no.
Respectfully yours,
Courtney Rosenthal
District 7 resident
March 24 update: Today the Austin City Council deliberated on these two resolutions. They were amended to address concerns raised by the community, including the ones I discussed here. The amended resolutions were approved.
Just like TVs, computer monitors have been packing in more great tech at lower price points. These days, mid-range monitors deliver excellent performance at affordable prices.
But when you go monitor shopping you’ll find a bewildering number of choices. The B&H Photo website lists a staggering 561 different computer monitors in stock. Where to begin?
Maybe I can help – at least if you’re looking for a monitor for routine office work. We can make a few quick and easy decisions that will reduce this to a more manageable set of choices.
In the end I heavily leaned on reviews, but going through this exercise helped me understand the choices a lot better.
I had a few key parameters for my purchase.
First and foremost, I wanted a monitor for routine office work. I was not looking for gaming, media editing, or graphics production. If you’re looking for one of those then stop here – my suggestions won’t help.
Also, I buy for value. When it comes to computer tech, I prefer the best I can get at a reasonable price point. I want neither high-end performance nor rock-bottom prices. I shoot for the middle, at the sweet spot of current market availability.
Finally, desk space is a consideration for me. I have rather limited desk space. I’d love a huge, curved-screen monitor, but that wasn’t going to be in the cards.
So, let’s narrow down the pool of choices, and dive into decisions for some important parameters.
Recommendation: Pick IPS.
There are numerous display panel technologies available. Currently, the most common are IPS, TN, VA, and OLED.
IPS (In-Plane Switching) currently is the best all-around technology for monitor display panels.
TN (Twisted Nematic) and VA (Vertical Alignment) are older technologies. They have some limited advantages for gaming or media viewing, but a lot of drawbacks. IPS is going to be a better choice, particularly for routine office work.
OLED is baller technology and freaking expensive. Save that for your high-end workstation.
The chart to the right breaks down the current offerings at B&H Photo.
It shows that the current market sweet spot is IPS panels. With 718 models available, that’s about four times as many as VA, the next most common panel type.
For more information about monitor panel types, here’s an article from monitor manufacturer ViewSonic that compares various panel types.
Recommendation: 27” for me, but your mileage may vary.
I made this decision on two points: available desk space and current market sweet spot.
The chart below breaks down the current offerings at B&H Photo. (It is page 2 of 3, which shows the bulk of available size selections.)
27” clearly is the current sweet spot, with 336 entries.
The next common step up is around 32”. The next page of monitor size listings (page 3) shows about half as many available monitors around 32”, so that’s not going to be the best value choice right now.
You might choose to pay for the upgrade to the larger display, but it is a considerable price step. There’s a 60-100% price premium for a 32” model over the 27” model in a given product line. A dual monitor setup might be a better value than a single 32” display right now.
But given that desk space is a consideration for me, I had to go for 27”. I find that 27” isn’t quite big enough to display two windows fully side-by-side. I have to arrange them with about 33% overlap. Not ideal, but workable.
Workable and affordable is good enough for me, so I’ll take a 27” monitor.
Recommendation: 2560x1440 (QHD) for 27” monitor
Here are some monitor resolutions commonly available today:
+------+--------------------+-------+
| name | resolution         | lines |
+------+--------------------+-------+
| FHD  | 1920 x 1080 pixels | 1080p |
| QHD  | 2560 x 1440 pixels | 1440p |
| 4K   | 3840 x 2160 pixels | 2160p |
+------+--------------------+-------+
(Pixels are the individual dots that you can see when you get your snout right up in the screen. Lines are the number of scan lines – horizontal rows of pixels – on the screen.)
Ideal resolution is the point at which the pixels blend together perfectly smoothly. At lower resolution you begin to see the individual dots. At higher resolution, items display too small and you may need to scale up.
Ideal resolution is determined by two factors: viewing distance and pixel density. Viewing distance is how far your eyes are from the plane of the display. Pixel density is how tightly the pixels are packed, measured in pixels per inch.
This article is helpful for understanding all these details: https://www.displayninja.com/what-is-pixel-density/
It suggests the best viewing experience comes at about 110 pixels per inch (PPI).
For a 27” monitor, 2560 x 1440 (QHD) resolution works out to 109 PPI, which is pretty much right on this mark.
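If you want to sanity-check those numbers, pixel density is just the diagonal resolution in pixels divided by the diagonal screen size in inches. A quick Ruby sketch:
# Pixel density (PPI) = diagonal resolution in pixels / diagonal size in inches
def ppi(width_px, height_px, diagonal_in)
  Math.sqrt(width_px**2 + height_px**2) / diagonal_in
end

puts ppi(2560, 1440, 27).round(1)   # => 108.8 -- QHD on a 27" panel, right at the ~110 PPI target
puts ppi(1920, 1080, 27).round(1)   # => 81.6  -- FHD on the same panel is noticeably coarser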
For a 32” monitor, I’d consider going up to 3840 x 2160 (4K) resolution. But I haven’t verified this, so don’t take my word for it.
Recommendation: only if you can use it
The last choice is: do you want laptop docking support? This is a matter of preference and whether the feature is worth the added cost to you.
Laptops have always had USB connectors, which serve a multitude of purposes. They can connect all the usual USB devices: keyboard, mouse, speakers, wired Ethernet.
Modern laptops have USB-C connectors with two extremely important new capabilities: you can connect a video display through them, and you can use them to power the laptop.
This means with a single USB-C cable, your monitor can be a laptop docking station. When I compared similar monitors differing only in laptop docking support, the cost difference was about $50.
So, if this feature interests you, go for it. If you do, first make sure the monitor USB-C connection is capable of delivering sufficient power to run your laptop.
If the feature isn’t worth the added cost for you, then feel free to take a pass.
The selections discussed above reduce the pool of choices from 561 to a much more manageable 29. I started looking into reviews of those 29 to make my final decision.
I purchased the aforementioned ASUS PA278QV for $319.
The PA278CV adds USB-C laptop docking support for an additional $50, and is the Wirecutter favored pick. I didn’t need the feature, so I opted to save the $50.
I’ve had the monitor for a few weeks now and I’m happy with it. I’m using it on my main Linux workstation (an old Dell Optiplex 9010) driven by Intel integrated graphics.
The only problem I encountered is that initially I only got 1200p video instead of QHD 1440p. The problem was my old DVI video cable. I upgraded it to DisplayPort and now everything is copacetic.