Stacked PRs are a really natural fit for vibe coding workflows; they help turn illegible 10k+ line PRs into manageable chunks that you can review independently. (Not affiliated with Cursor or Graphite.)
My biggest frustration is the lack of a good universal REPL to just play around with. It's frustrating how I have to run `uvx --with x,y,z ipython` every single time I just want to spin up some Python code that may or may not use packages. (Hard to overstate how annoying it is to type out the module list.)
To me, Python's best feature is the ability to quickly experiment without a second thought. Conda is nice since it keeps everything installed globally, so I can just run `python` or IPython/Jupyter anywhere and know I won't have to reinstall everything every single time.
Would creating a `main.py` with the dependencies installed either as a uv project or inline work for you?
One thing I did recently was create a one-off script with functions to exercise a piece of equipment connected to the PC via USB, and pass that to my coworkers. I created a `main.py` and uv add'ed the library. Then when I wanted to use the script in the REPL, I just did `uv run python -i main.py`.
This let me just call functions I defined in there, like `set_led_on_equipment(led='green', on=True)` directly in the REPL, rather than having to modify the script body and re-run it every time.
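A minimal sketch of that pattern. The `# /// script` block is PEP 723 inline script metadata, which uv reads so that `uv run python -i main.py` installs the dependencies and then drops you into a REPL with your functions defined. Note that `set_led_on_equipment` and the `pyserial` dependency here are hypothetical stand-ins, not the actual library from the comment above:

```python
# main.py -- sketch of the "script as REPL entry point" pattern.

# /// script
# dependencies = ["pyserial"]
# ///

def set_led_on_equipment(led: str, on: bool) -> str:
    """Toggle an LED on the device; stubbed out here for illustration.

    A real version would send a command over the serial/USB connection.
    """
    state = "on" if on else "off"
    return f"LED {led} turned {state}"

if __name__ == "__main__":
    # `uv run python -i main.py` runs this, then stays in the interactive
    # prompt, where you can keep calling the functions defined above.
    print(set_led_on_equipment(led="green", on=True))
```

From the `-i` REPL you can then call `set_led_on_equipment(led='red', on=False)` directly, without editing and re-running the script.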
Edit: another idea I just had is to use just[0] and modify your justfile accordingly, e.g. `just pything`, where the `pything` target actually runs `uv run --with x,y,z ipython`.
Edit edit: I guess the above doesn't even require just; it could be a command alias or something. I'm probably overengineering this lol.
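Concretely, the alias version could be a shell function in your `~/.bashrc` or `~/.zshrc` (a sketch; the module list here is a made-up example, swap in whatever you keep retyping):

```shell
# One-word launcher for a throwaway IPython session with your usual packages.
# Extra arguments are passed through to ipython.
pything() {
    uvx --with requests,rich ipython "$@"
}
```

A function rather than a plain `alias` so that arguments like `-c` still pass through cleanly.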
From a process perspective, how can a constituent know with absolute certainty that their vote was counted, every voter in the system was legal, and the final tally was authentic? Especially when there's no way to even audit what you voted for after the fact?
Every time I try to get to the bottom of this, it always boils down to "trust the system" which makes me uneasy.
Not being able to audit what you voted for after the fact is by design. Otherwise, it would make buying votes a viable strategy since you'd be able to show them who you voted for. Yes, taking a picture of the ballot is an option, but you can always ask for another ballot paper after you take the photo. Where I live, you're not even allowed to have a camera out in the same room as a voting booth for this exact reason.
IMO the best solution here is to have electronic counting with an auditable and traceable paper trail as a backup. Every time I've voted for the past 10 years has been like this. First, I get a ballot paper from the front desk and stick it into an airgapped ballot marking machine. I then make my choices and the machine prints them onto the ballot paper. I'm able to read the paper and verify that it matches the choices I made. I then stick it into a separate airgapped ballot counting machine, which scans my ballot and deposits the paper copy into a sealed box. The entire process of setting up the machines, transporting the paper ballots, and reading the results from the machines is cross-checked and signed off on by volunteer poll workers from both parties.
Each polling station should have representatives from multiple parties as well as independent observers.
> how can a constituent know with absolute certainty that their vote was counted
The representative of your party plus independent observer said all votes at your polling station were counted. You know both those community members and know them to be generally honorable. Ergo your vote was counted.
> every voter in the system was legal
Neither the observers at the polling station nor the station head claimed any illegal person voted.
> the final tally was authentic
The observers all signed as witnesses on the final tally.
This is not "the system", it is humans you know who are telling you what they saw. If you can't trust other humans at their word, democracy fundamentally cannot work.
You should trust political volunteers after you have seen their track record of being honest and truthful. (Though there is some default amount of trust the process gets because of the adversarial nature of volunteers with opposing biases checking the process).
This is along the same vein as
You should trust candidates for the seat after you have done your due diligence that they are honest and truthful, and will faithfully represent you in the legislature/administration.
as well as
You should trust civil servants to have done state activities justly and produced truthful records and reports of state activities after you have seen a record of them doing these things correctly over time.
Democracy with humans is built on a lot of trust in humans. We have to keep this in mind when arguing about these things.
You do not have to watch every district, every election, every time. But given that enough people do it, at least once, at least in their own district, then it is easy to see why the system as a whole is trustworthy.
I think the sentiment of the OP actually gets to the heart of this (the idea of open-source is transparency, visibility, auditability), but the problem is that it needs to be applied to the actual process, not to the process of building tools for the actual process.
It's not that developing voting software should be open-source; it's that actual voting should be "open-source" in the physical sense.
Trusting the system is possible if you can (you, yourself) readily observe every part of the system. I don't think giving members of the public access to the server your voting software is hosted on is a very viable idea, but giving members of the public access to paper count centres is (it's done very successfully in many countries).
I put a lot of thought into my prompts. I can code, though definitely not as well as the AI or the people here on HN; it's something I've always enjoyed, tinkering with things.
The AI agent in Cursor with Gemini (I'm semi-new to all of this) is legit.
I can try things out, see for myself, and get new ideas. Mostly I just ask it to do things and it does them; for specific things I highlight code in the editor and say "Do it this way instead" or "for every entry in the loop, add to variable global_var only if the string matches ./cfg/strings.json". I _KNOW_ I can code that myself.
>AI-generated code needs to be reviewed, which means the natural bottleneck on all of this is how fast I can review the results
I also fire off tons of parallel agents, and review is hands down the biggest bottleneck.
I built an OSS code review tool designed for reviewing parallel PRs, and it's way faster than looking at PRs on GitHub: https://github.com/areibman/bottleneck
I think people will somewhat hate it because the older style and assets, mixed with more modern lighting and textures (often AI-upscaled ones), will look... kinda shiny and overall a bit off, same as happened with the GTA remaster, other launch issues aside.
That said, I am still in the minority that enjoys such attempts, if nothing else because these modern versions often run a bit better, with proper high resolutions and widescreen support, and sometimes receive quality-of-life fixes that bring the game up to speed and make it play like something a bit more modern. The jank of early-2000s games is something I don't enjoy.
At the same time, there's no reason a mod made by passionate members of the community couldn't do more or less the same, and often a bit better. That's what's jarring: some companies seem to do the equivalent of outsourcing it, trying to produce something quickly and on a budget.
The sales numbers show that you're not in the minority for enjoying remasters.
There's a very loud cacophony from a minority of fans who get really, really angry because someone touched their favorite thing and moved some rivets around. Similar to how people were VERY VERY angry about the changes and omissions the LotR movies made to the LotR books.
"Moving rivets around" is how I might describe the recent SS2 remaster from Nightdive; it was pretty good. This DX remaster is more like "let's have the cheapest contractors we can find run this venerated classic through an AI upscaler and charge 30 bucks for it". Notice the sign on the wall in the UNATCO break room in the remaster trailer that says "Stratigies". DX deserves far better than that.
On the bright side, Deus Ex: Human Revolution, while made by a largely different dev team, was a fantastic game and scratched the itch for me when it came out. One of the few games in the last 15 years I've actually played to completion.
Eh. It didn't really feel like Deus Ex to me, spiritually. I think Prey (2017) hits way closer. I haven't played the System Shocks, so I don't know if they were aiming for that or for Deus Ex. I've heard some people from Looking Glass went to Arkane. It seems like there's only 5-10 people in the world who know how to make immersive sims and keep making action RPGs where you can upgrade your body and have quests with multiple viable approaches.
It had some cool mechanics but was pretty meh on the storyline. I didn't really feel a connection to the characters like I did with the original Deus Ex. Except maybe the copter pilot girl.
If everyone who liked the original goes out and buys a remaster, then hates it, aren't they still counted as a sale? It's like a movie ticket, you don't know if the movie will be any good when you pay for it. I don't see sales numbers as being linked to quality or enjoyment.
Why would anyone do that? I would assume people can just read a review first before buying. Especially since they already played the original, so it isn’t like they would need to worry about spoiling the story for themselves.
Also, easy Steam refunds if the game was played for under 2 hours is a thing.
I felt like outside of bug fixes and QoL improvements, Revision's maps and soundtrack were a downgrade compared to the original. I feel quite strongly about this, but I recognize it's also largely up to taste and it's valid to like those changes.
But for me, they made the experience worse and they were enabled by default. As a result, I gave it a thumbs down on Steam and did my best to explain my thoughts in detail. I received a couple dozen negative comments over the years on that review, largely in the vein of how dare I give negative feedback to a labor of love provided for free. That kind of argument did make me feel guilty, like I was being unfair to the developers. I eventually changed it to a positive review, and now I regret doing that. I allowed my genuine opinion to be clouded.
This was an extremely tame internet conflict overall, I'd feel ashamed to frame myself as a victim over so little. What I'm trying to say is that both sides are capable of failing to genuinely engage with the other.
It's definitely true that Revision has been to some degree unfairly attacked. There are purists who don't give it a fair shake and make ludicrously confident statements, peddling opinion as fact. But there are also legitimate reasons to dislike it. Not knowing you, I'm not at all accusing you of lacking nuance on this topic. I'd just like to say, as a general statement, that discourse ends up healthier when people care about distinguishing between people who disagree with them and people who disagree with them _and_ are acting in bad faith.
Remasters are generally worse than the originals. There are rare exceptions, but usually you get either a mixed bag or an outright worse experience than the original. Ditto for remakes.
The System Shock 1 remaster in particular is excellent. It's not just a graphical improvement; they improved the controls, the inventory interface, and a host of other legacies from DOS-era gaming that did not age well.
It's been well received since launch, yes. But there was plenty of screeching about it "not being true to the original", "this looks ugly", and "this looks wrong" on the way there. Just like for this.
Especially for the first remake.
This is why I say the screeching right now based on a trailer means nothing. "Fans" always get mightily offended if someone touches their childhood favorites.
You don't have to be a fan of the game to see the trailer and think it looks bad. I'm certainly not a Deus Ex fan. I played it just once, years and years ago. Yet when they showed the trailer during State of Play I was shocked at how bad it was. It's clearly just some texture upscaling and an update to the lighting. The original looks ugly and this remaster manages to look uglier.
An example of a remaster done right was Halo 1 (which is actually quite an old remaster at this point). They threw a new graphics engine on top but also remodeled and retextured everything. That's what I expect out of a proper remaster.
While I enjoyed CE:A, it had issues. The big one was the loss of bump-mapped textures making everything look flat. They only show up when you use a flashlight.
Games people love usually have a large portion of spontaneity behind their success. This is hard to faithfully capture even if that's your intention, and remasters are usually done on a tiny budget for profit by people who often aren't even familiar with the OG game, which doesn't really help.
There are exceptions though, and also there are some remasters that are not faithful but are good on their own.
I don't care for Deus Ex, but looking at the screenshots I struggle to tell which one is the remaster. It's very clear that they messed up the lighting and the overall mood though. I'd be offended as a fan.
As a GMDX user, I find both your comment and the remaster irrelevant. The mentioned mod makes the first Deus Ex perfect: it enhances places, items, locations, graphics... without breaking the original gameplay and mood to please Gen-Zers.
When I worked at a Microsoft shop, I used Azure DevOps. To be honest, it's actually not bad for .NET stuff. It fits the .NET development life cycle like Visual Studio fits C#.
Stash (now BitBucket Server) had the best code review going, head and shoulders above GitHub, to the point I thought GitHub would obviously adopt their approach. But I imagine Atlassian has since made it slow and useless, like they do with all their products and acquisitions.
Stash was not an acquisition. Stash was built from the ground up inside Atlassian during its golden age, by a bunch of engineers who really cared about performance. Though it helped that they didn't have Jira's 'problem' of having 8 figures of revenue hanging off a terrible database schema designed a decade ago.
You might be thinking of Fisheye/Crucible, which were acquisitions, and suffered the traditional fate of being sidelined.
(You are 100% correct that Stash/Bitbucket Server has also been sidelined, but that has everything to do with their cloud SaaS model generating more revenue than selling self-hosted licenses. The last time I used it circa 2024, it was still way faster than Bitbucket Cloud though.)
Source: worked at Atlassian for a long time but left a few years ago.
Yeah, I think I was remembering things backwards: since they put Stash under the Bitbucket organisation and branding, it looked as if Bitbucket was their own product and Stash the outside acquisition, but it was actually the other way around.
There was a locally-hosted Git server platform called Stash. Atlassian bought it, rebranded it as "BitBucket Server" (positioned similarly to GitHub Enterprise or self-hosted GitLab) and gradually made it look and feel like BitBucket (the cloud product), even though they're actually completely separate codebases (or at least used to be).
Which is ironic, because historically the slowness of GitHub's UI was due to them not using much JS and requiring round trips for things like flagging a checkbox.
I use it every day and don't have any issues with the review system, but to me it's very similar to GitHub. If anything, I miss being able to suggest changes and have people click a button to integrate them as commits.
Commenting feels so much better: you can comment on entire files, and you can leave review comments that actually "block" (rather than just getting appended to the conversation).