
Absolutely agree. The single master branch, commit when you feel like it workflow is very simple, and still miles better than just copying files in the file system.


But, who cares? If this worked for the developer, what's wrong?


He should care. The effort to learn git would have saved him time. Time he could have used to improve DF or to sleep, whatever. But instead he's still moving folders around.

I believe it's a balance. You don't want to spend your life just adjusting your IDE settings to the point that you never work on your actual project. But you shouldn't completely neglect anything that could improve your life with little effort.

I confess I'm guilty of "procrastination by configuration". I've probably spent a year of my total career just messing with my settings or chasing the perfect git or vim config files.


I think you may be overestimating the amount of time and effort it takes to move folders on a modern computer.


I don't disagree that he needs a source code control system. But it's his intellectual property, and right now he has some protection through redundancy. At least I certainly hope he keeps some USB sticks with a friend, because if not, one house fire and his livelihood is gone.

If he goes online he has to worry about someone hacking his account; if he keeps it at home we're back to asking what his backup strategy is and what his redundancy model looks like, and that's just to avoid losing anything he already has. Next he needs to learn git well enough to be effective. Not that it's hard, but if you've never used any SCM before it can be intimidating, and all of it takes time.

It may just be easier to say later, later, later because you don't want to take however long the move will take, and you're thinking of all the lost development time. Of course, if he ever really needs it (house fire, tornado, hurricane) the price will then seem quite cheap.

In the meantime I'd at least find a way to backup off-site even if it's scp'ing passworded zip files to an Azure VM.
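A minimal sketch of that off-site idea. Everything here is a placeholder (the `df_src` tree, the passphrase, the `backup.example.com` host); and I'd reach for `openssl` rather than zip's weak built-in encryption:

```shell
mkdir -p /tmp/df_src && echo 'demo source' > /tmp/df_src/main.cpp  # stand-in source tree
tar czf /tmp/df_backup.tar.gz -C /tmp df_src                       # bundle it up
# Encrypt before it leaves the machine:
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in /tmp/df_backup.tar.gz -out /tmp/df_backup.tar.gz.enc -pass pass:hunter2
# Off-site copy (placeholder host, so left commented out):
# scp /tmp/df_backup.tar.gz.enc user@backup.example.com:/backups/
```

A cron job running something like this nightly is about the cheapest disaster insurance there is.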


> But, who cares? If this worked for the developer, what's wrong?

Chances are it didn't work for the developer. I've been in places where people were versioning by folder, and pretending it was fine, because they didn't want to learn how to do it properly. They'd come up with all manner of excuses for problems that git/svn/hg/etc had already dealt with. "It's easier to just copy paste", "It's a waste of my time to spend time on a VCS", "What if we both change the same file". I've heard them all, and when I look back at the colleagues who said this, they produced a lot less than they ought to have. Not just because of version control; it's one among several software red flags (never commenting anything, not having tests, not making an effort to make things work on other people's machines, etc).

I'm not going to say anything bad about the DF guy, this seems like a strange foible, but in general not using version control is a major red flag that someone is actually a bit of a novice who is low on productivity.

Version control is the big one, because if you're productive you tend to write a lot of code, and you'll tend to have hit the same issues that everyone has when they're writing a lot of stuff.


> I'm not going to say anything bad about the DF guy, this seems like a strange foible, but in general not using version control is a major red flag that someone is actually a bit of a novice who is low on productivity.

Sure, not using version control in an environment where collaboration is to be expected (like in teams/companies where you work with others), I'd agree with you. But that's not Dwarf Fortress, it's just a single person doing the work. Obviously Tarn is not "low on productivity" or "a bit of a novice" since he has done something many of us will never do in our entire lives: create entertainment that will forever live in some people's memories. Just because he's not using Git doesn't mean it's a red flag, since using Git would give him basically zero benefits and just add more to learn for, again, zero benefits.


It depends whether the process is part of the art itself, or just the result. If someone wants to be the excel artist (https://www.spoon-tamago.com/2013/05/28/tatsuo-horiuchi-exce...), that's cool even if other tools would give them more possibilities. But in this case, the 80s development style is neither part of the art, nor is it better in any way at all, so people keep pointing out that this decision is just objectively worse than doing the absolute minimum of modern version control.


I get your point, but it's also not entirely accurate to say the workflow and tools aren't part of the art.

Creativity is a fickle thing, and it's unlikely that steps X,Y,Z would produce the same inspiration as A,B,C.

I agree that modern version control would offer benefits, but maybe in moving to git Tarn would decide to sha1 everything internally, which would have some ultimate impact on the game that results.


> not entirely accurate to say the workflow and tools aren't part of the art.

That workflow and those tools are not something they advertise, are proud of, or concentrate on. We know about it only from a random interview, so I don't see a reason to count it as part of DF-as-art.


I think you're talking about the artistic _product_, and the ethernet bridge is talking about artistic _process_. Like, you don't care what kind of paintbrush a painter uses, but it matters a lot to them.


* Free sync across multiple devices

* Ability to roll back to some older state. Often I have dismissed some solution only to later realize that I need it again, or parts of it. If I have ever committed it, it's still around. Even if it's an orphan commit now, or, in the extreme case, even if it was only ever in the staging area, you can still recover it later, at least until garbage collection has deleted it.

But whatever floats your boat, I guess.
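The staged-but-never-committed recovery can be demonstrated in a throwaway repo (all names here are illustrative):

```shell
rm -rf /tmp/df_demo && git init -q /tmp/df_demo && cd /tmp/df_demo
git config user.email demo@example.com && git config user.name demo
echo "keep this" > notes.txt && git add notes.txt && git commit -qm "first"
echo "discarded idea" > notes.txt
git add notes.txt        # staging writes the blob into .git/objects
git reset --hard -q      # the change is gone from index and working tree...
git fsck --dangling      # ...but fsck still finds the unreferenced blob
git cat-file -p "$(git fsck --dangling | awk '/dangling blob/ {print $3}')"
```

The last command prints the "lost" content back out. You'd never build a workflow around this, but it has saved people's bacon.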


Also the ability to see when what feature of the code was introduced.

That can help a lot with understanding code bases: people seldom comment their code properly, so any context you can get is good.

(In my opinion, really good code can often answer 'what' and 'how' without comments, but it has a hard time answering 'why', and an almost impossible task answering 'why not', i.e. 'why not this alternative approach?'

Whether the answer to that one is "we tried it and it didn't work" or "we didn't think of it, you are a genius for bringing it up" or something else can be important.)


There is barely a week that goes by that I don’t start off with a good idea but halfway through realize I’ve taken a wrong turn or bitten off the wrong chunk first. If it weren’t for git, or more frequently JetBrains’ local history, my output would be lower quality.

And what the Mikado Method teaches us is that when a refactoring gets out of hand, there’s a much smaller refactor hiding inside it waiting to get out. Revision history is both a blessing and a curse here, but generally there are bits you can salvage without falling into sunk cost.


It prevents collaborating with others, rolling back to a specific release to test a bug, and partially developing multiple ideas in parallel.

All of that is possible manually, but it is doing version control by hand. There are plenty of great tools out there to simplify that process. And if you start automating your manual custom process, you just end up with your own version control system, which is something additional you have to maintain.
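The rollback-to-a-release case is a one-liner once releases are tagged. A sketch in a throwaway repo (the tag name and files are made up):

```shell
rm -rf /tmp/df_rel && git init -q /tmp/df_rel && cd /tmp/df_rel
git config user.email demo@example.com && git config user.name demo
echo "v1 code" > main.cpp && git add . && git commit -qm "release"
git tag v0.47.05                     # mark the shipped release
echo "half-done idea" >> main.cpp && git commit -qam "wip"
git checkout -q v0.47.05             # exact shipped code back, one command
cat main.cpp                         # just "v1 code", no wip line
```

The by-hand equivalent is digging the right dated folder out of a backup and hoping nothing leaked between copies.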


Seems like some of the best work is done thinking during menial tasks, more so than during actual coding time!


Nothing until a catastrophe causes lost work, or if they wanted to collaborate with someone else.

In fact, right now the main DF developers are working with a partner studio to put a better UX on DF to sell on Steam. I bet an industry standard version control would be extremely helpful right about now.


I don’t doubt that the lack of VC is causing that project’s delayed release. It was announced nearly three years ago.


git blame so the developer can remember why they did something 15 years ago.
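A sketch of that lookup in a throwaway repo (file name and commit message invented): blame maps a line to the commit that last touched it, and the commit message carries the "why".

```shell
rm -rf /tmp/df_blame && git init -q /tmp/df_blame && cd /tmp/df_blame
git config user.email demo@example.com && git config user.name demo
echo "# limits" > limits.txt && git add . && git commit -qm "init"
echo "cap = 200" >> limits.txt \
  && git commit -qam "Cap population at 200: pathfinding chokes above that"
git blame -L 2,2 limits.txt          # which commit wrote line 2?
git log -1 --format=%s "$(git blame -l -L 2,2 -- limits.txt | cut -d' ' -f1)"
```

Fifteen years later, that one-line commit message is the only place the reasoning survives.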


So you mean svn?


Yes, svn would support this use case.

If I remember right, svn used to be a bit more involved than e.g. git for just starting to track a local project. I think you used to need to set up a (local) server etc.?

They seem to have fixed that, so the barrier to entry is lower than it used to be.

Almost any version control is better than manual mucking around with files.

Git is probably the better investment if you are new to version control systems, even if just because everyone else uses it these days.

But svn would work ok-ish, too. And you can always convert your history later, if necessary.


Subversion doesn't need a local server, a directory for the repo will do.

In fact, the only reason I use git instead of subversion is that I am required to do so.

Most people use git as if it were Subversion anyway, and when they mess up their local checkout, they just re-create it.

No one wants to take a PhD in git command line flags.
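For what it's worth, the serverless Subversion setup really is just a few commands. A sketch, assuming the `svn`/`svnadmin` CLI tools are installed (all paths are placeholders):

```shell
rm -rf /tmp/df_repo /tmp/df_wc
svnadmin create /tmp/df_repo                    # the "server" is just a directory
svn checkout -q file:///tmp/df_repo /tmp/df_wc  # working copy over file://
cd /tmp/df_wc && echo "hello" > main.cpp
svn add -q main.cpp && svn commit -q -m "first commit"
svn log file:///tmp/df_repo                     # history is already queryable
```

No daemon, no network, no account to be hacked.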


Maybe people are wired differently but I have always thought that git is way more intuitive than Subversion and I was more productive after 1 week of git than I previously had been after 1 year of using Subversion. I am sure Subversion has improved since then, but so has git.


It would be really interesting to learn what git users think is less intuitive in Subversion than in git.


Quite a lot.

* Syntax - in svn, different commands use different conventions for how to specify a revision

* In svn, moving directories breaks the "subdirectory as branch" abstraction. I.e. there are no actual branches, only files and versions of them.

* "svn up" can fail, leaving the working copy in a broken, unpredictable state. This breaks the abstraction, revealing that svn is little more than copying files around. Git forces you to save your work and only then merge; that doesn't fail even when it fails to merge cleanly.


* ahem. Revision is always a revision in SVN. It’s rN and all commands accept it.

* ahem. All branches in SVN are copies of an existing tree at a particular revision. Call it a branch or whatever you want. But if you make a copy of your main development tree, you make a branch and you can work on it in the same fashion as if it was git. Merge it back to trunk when done or continue your work and sync with trunk or other branches. The principle is the same.

* Failing “svn up” is an unexpected technical problem that occurs due to problems with the server or network. The problem should never happen in normal cases.


First point: here is an official svn doc with many mentions of the @revision syntax:

https://edoras.sdsu.edu/doc/svn-book-html-chunk/svn.ref.svn....

Having two styles of revisions is super confusing to users like me. Git confuses me a bit too, but not in command syntax. Hope this illustration helps show what I mean.


Maybe a more up to date version of the doc will help: https://svnbook.red-bean.com/nightly/en/svn.advanced.pegrevs...


To be honest, Git is pretty good about dealing with 'unexpected technical problems'.

It even handles running multiple instances of git commands at the same time rather well.


The xkcd cartoon is still quite up to date.


>No one wants to take a PhD in git command line flags.

Yep, that's why I still use BitKeeper... and it's great :)


You forgot to mention the "click on this one specific checkbox or your source code is now leaked to the entire internet forever" part. So actually, not that simple and potentially much much worse.


> You forgot to mention the "click on this one specific checkbox or your source code is now leaked to the entire internet forever" part.

Git != Github

Just using git on a local machine is nothing like this, unless there's something large I'm missing.


This is just "copying to a flash drive" with extra steps.


Except you also get a better history (assuming you write good commit messages), and it's all in the same folder.

You could also use git-send-email to make a mailing "list" with just yourself, and that way make it much easier to move between machines as you work. Or if you don't trust your email hosting provider, self host gitea or sourcehut (or have the option to later).

Finally, you also get the ability to better organize development with branches.

I don't even know git that well and came up with that off the top of my head.


Git doesn't require an internet connection. With git you can keep doing backups the same way and store the git repository in flash drives, Dropbox, Amazon Glacier, or anywhere you want. The main difference here is that you don't need one folder per version anymore.
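A fully offline version of this is a bare repo on the drive itself, used as a remote. A sketch with placeholder paths standing in for the mount point:

```shell
rm -rf /tmp/usb_drive /tmp/df_work
git init -q --bare /tmp/usb_drive/df.git     # stand-in for a mounted flash drive
git init -q /tmp/df_work && cd /tmp/df_work
git config user.email demo@example.com && git config user.name demo
echo "v1" > main.cpp && git add . && git commit -qm "v1"
git remote add usb /tmp/usb_drive/df.git
git push -q usb HEAD                         # the whole history travels in one push
git -C /tmp/usb_drive/df.git log --all --oneline
```

Same flash drive as before, but every push carries the complete history instead of one snapshot folder.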


Git has no checkboxes. It is purely command line. Are you talking about Github or Gitlab? I know Gitlab used to have this bug where you could easily accidentally give everyone access to all your repos.


Not fully true: there are gitk and git-gui (both part of the full git package)... but yeah, no "publish to internet" checkbox there ;)



