I think the lesson is not that bucket-o-flash-drives is a better way; I think the lesson is that you can make a not-ideal process work completely fine if you just keep focused on the main point. People made successful software way before version control existed. It just makes it easier but that's all.
Game Development
Welcome to the game development community! This is a place to talk about and post anything related to the field of game development.
Romero talked about how they would just pass floppy disks to each other. It's just bigger disks now.
Linux was written up to about version 2.2 or 2.4 or thereabouts with no version control, just diff and patch and email. They invented git because at a certain point they wanted automated tools to make their way of working easier (which none of the VCSs of the time were suited to), but it wasn't like they couldn't do the job until the tools existed.
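That diff+patch loop is easy to sketch. A toy example (invented file names, obviously not the actual kernel process): the contributor generates a unified diff, mails it, and the maintainer applies it with patch(1).

```shell
cd "$(mktemp -d)"
# Contributor side: the old file and the edited file
printf 'hello\n' > main.c.orig
printf 'hello world\n' > main.c.new
# Generate a unified diff -- this is the "email attachment"
# (diff exits 1 when files differ, hence the || true)
diff -u main.c.orig main.c.new > fix.patch || true
# Maintainer side: apply the mailed patch to their copy of the tree
cp main.c.orig main.c
patch main.c < fix.patch
cat main.c   # prints "hello world"
```

`git format-patch` and `git am` are essentially this loop with commit metadata bolted on, which is why emailing patches feels so git-like.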
Emailing patches is shockingly similar to git tho
Exactly. They did what we all wish we could do, they took a process that worked for them and automated it.
Thankfully they shared it with the world, and it works for most of us, too.
https://git-scm.com/book/en/v2/Getting-Started-A-Short-History-of-Git
They invented git because bitkeeper withdrew support
Yes. The little word "suitable" was doing a lot of work in my explanation, it's true. BK was invented by a kernel developer for pretty much exactly the reasons I explained; I just glossed over the part of the route that went diff+patch -> BK -> git as a detour. I was actually a lurker on the linux-kernel mailing list while all this was going on and saw the fireworks in real time.
It's not exactly accurate to say BK withdrew support. Larry McVoy was such a pain in the ass about wanting to control how people used his product that the kernel developers felt the need to write their own solution from scratch rather than keep putting up with him. (Specifically, his pulling Andrew Tridgell's license for dubious reasons was the straw that broke the camel's back.) RMS actually wrote a sarcastic open letter thanking McVoy for providing a good lesson in why it's a bad idea to let your important stuff be dependent on proprietary software.
But the point remains; BK was invented by a member of the kernel community specifically because none of the existing solutions were usable for them, after more than 10 years of one of the biggest and most-distributed software projects in the world having no source control whatsoever.
Project Zomboid had to start again after their flat got burgled and the laptops were stolen. Offsite backups are key for theft and fire. Version control is the easiest and cheapest way to get them.
Someone always knows someone that drinks, smokes and eats crap and lives until mid 90s. Doesn't mean it's good health advice.
Beware anecdotal evidence.
Madness. If you ever have multiple devs touching the same files, this will lead to nightmare scenarios when integrating code.
But it has guns.
As someone who inevitably gets thrown into the "devops" side and the like:
The vast majority of developers can't integrate code or even resolve a merge conflict (and god help you if someone convinced the team to do rebasing instead...). They just stop working and then whine three weeks later during a standup that progress is halted on their deliverables. And, because of the stupidity of "devops" as a job role, there is an increasing culture that development should not have to worry about this kind of stuff.
So good project management becomes splitting up files to minimize the chance for conflicts and spreading tasks out to minimize the times people will be in the same file, let alone function. And if they do? Then you do whatever the latest buzzword is for "pair programming".
I will never understand the idea that rebasing inherently causes problems. Rebasing gives a much cleaner history and reduces the number of commits with multiple parents, making it approximate a simple tree rather than a more complex graph.
The simple rule is branches that only you work on can be rebased, shared branches must be merged.
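That rule in practice looks something like this. A self-contained toy demo (invented branch and file names) that rebases a private branch onto main and then merges it with a merge commit:

```shell
cd "$(mktemp -d)"
git init -q
git checkout -qb main
git config user.email you@example.com
git config user.name you
echo base > file.txt && git add . && git commit -qm "base"
# Your private branch -- safe to rebase, nobody else builds on it
git checkout -qb my-feature
echo feature >> file.txt && git commit -qam "feature work"
# Meanwhile main moves on
git checkout -q main
echo other > other.txt && git add . && git commit -qm "main moved on"
# Replay your commits on top of the new main
git checkout -q my-feature
git rebase main
# Shared branches get merged instead; --no-ff keeps a merge commit
git checkout -q main
git merge --no-ff my-feature -m "merge my-feature"
git log --oneline
```

If a rebase hits a conflict, you fix the files, `git add` them, and run `git rebase --continue` (or `--abort` to bail out cleanly).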
I've never understood the complaints about rebasing. Just make sure you merge if it is complicated
Jokes aside: It honestly isn't THAT much worse. But if you don't "understand" git, you can fuck up your history and it is a real mess to recover from a "failed but technically not" rebase. Whereas merges just result in a shitfest of a history but everything gets reconciled.
Although, a bit of a rant: I do still think rebasing is fundamentally "bad" for long term debugging. As a simple example, let's say that you changed a function signature on branch A and merged it in. Branch B uses that function and started before A. After a rebase, there is no indication that those previous commits would not have worked or were counting on a different signature.
Generally speaking, you can avoid this so long as you always add a merge commit to go with the pull requests (or futz around a bit to identify the known good commits). You assume that those are the only valid commits and move from there. But when you are debugging a bug that silently got added two years ago and think you are clever because you know how git bisect works? You suddenly have a lot of commits that used to work but don't anymore.
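For anyone who hasn't seen it, `git bisect` automates exactly that binary search. A self-contained toy demo (invented file and check script): commits set a number, and we pretend everything from n=4 on is "broken".

```shell
cd "$(mktemp -d)"
git init -q
git checkout -qb main
git config user.email a@example.com
git config user.name a
for i in 1 2 3 4 5; do
  echo "$i" > n.txt
  git add n.txt
  git commit -qm "set n=$i"
done
# Automated check: exit 0 = good commit, nonzero = bad commit
printf '#!/bin/sh\ntest "$(cat n.txt)" -lt 4\n' > check.sh
chmod +x check.sh
# Mark HEAD bad and the first commit good, then let git drive the search
git bisect start HEAD "$(git rev-list --max-parents=0 HEAD)"
git bisect run ./check.sh
bad=$(git show -s --format=%s refs/bisect/bad)
echo "first bad commit: $bad"   # "set n=4"
git bisect reset
```

The catch described above is real though: `bisect` assumes every commit in the range actually built and ran, which rebased histories can silently violate.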
It doesn't come up often (especially since so many workflows these days are "throw out the old code and redo it, but right this time") but you only need to run into that mess once or twice to no longer care about how clean your history is.
This feels like a problem where I just haven't had a complex enough code base to worry about it. I like rebasing because it feels more like I am committing when I intended, but if the deltas were too great it would be a huge issue.
Wouldn't small, more frequent changes solve this too?
If your project/code base suits itself well to being nothing but small feature branches, sure.
But reality is that you are going to have the "long living feature" branches where it doesn't really make sense to merge any of the code in until "it all works"
The “long lived feature branch” approach is kind of the entire problem that leads to merge hell though. Having long lived branches is at odds with the rebase centric style and that’s intentional. Rebasing incentivises keeping branches small and getting stuff into main as often as possible. Basically it’s about using git in a “trunk” style.
The next question is “what if I don’t want this code to go live yet” to which the usual answer is “feature toggles”
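A feature toggle can be as simple as a flag check at the branch point. A minimal sketch, with an invented `FEATURE_NEW_UI` environment variable: the new code path is merged into main but stays dark until the flag is switched on.

```shell
render_ui() {
  # Hypothetical toggle: default is "off" when the variable is unset
  if [ "${FEATURE_NEW_UI:-off}" = "on" ]; then
    echo "new UI"
  else
    echo "old UI"
  fi
}
render_ui                          # prints "old UI"
( FEATURE_NEW_UI=on; render_ui )   # prints "new UI"
```

Real setups usually read flags from a config service rather than the environment, but the shape is the same: code ships, the toggle decides.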
People get very dogmatic about this stuff haha. I’ve worked in teams that were very hype about using rebasing and teams that could easily handle long lived feature branches. The difference is the latter kind of team weren’t all trying to edit the same files at the same time. Hmm. So yeah I guess what works best is situational!
EDIT: I just realised this is a gamedev community whereas the above comment is based on my experience in the “enterprise business factory” dev world. Might be a bit different over here, not sure!
Pure ideologies work until you have tight deliverables. And "feature toggles" become problematic if you need to do a "release" and don't want to have undocumented (possibly unfinished) functionality that is one malformed configuration file away from being switched on.
At the end of the day, it is about balancing "clean" development with acknowledging that you are going to need to cut corners. Generally speaking, "open source" projects can get away with a strong focus on ideology because they don't have deliverables that can mean the difference between being rockstars and being this week's tech layoffs.
Agreed.. also, working with nested ancient feature toggle hell made me miss giant merge PRs haha.
I had no idea git-bisect exists, and we've been doing binary search for broken stuff by hand every time. Thank you for this mention!
We're just in the middle of investigating a performance issue, and this will definitely make it a lot easier.
Lots of times when rebasing you end up needing to resolve the same conflicts over and over again, and very few people know about rerere
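For anyone curious, rerere ("reuse recorded resolution") is one config switch: git records how you resolved a conflict and silently replays the same resolution the next time the identical conflict shows up, which is exactly what happens during repeated rebases.

```shell
# Enable rerere for all repos on this machine
git config --global rerere.enabled true
# Optionally also stage the replayed resolutions automatically
git config --global rerere.autoUpdate true
```

After that, the first time you resolve a conflict git saves it; on the next rebase of the same commits you'll see "Resolved 'file' using previous resolution" instead of the conflict markers.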
Where you are... I've never seen an example of this yet in the UK.
I think I get over their nightmare with the hundreds of millions of dollars they’ve made so far lol
Our last major college project that spanned multiple semesters was worked on by 5 devs all editing the same source files over Dropbox. The school had servers for svn, but no one knew how to do source control. It was exactly the type of shitshow you would expect.
You know what they say, if it's stupid but it works...
I love this so much :D That reads like something I'd expect from ZA/UM, but it also thankfully alleviates most of the major issues I had with the game, which I've already talked about here on Lemmy. I really liked the game, but there were a lot of red flags pointing to it being just a quick corporate cash grab, where they decided to basically re-skin their previous game with as little effort as possible, to quickly sell it and cash in on the Pokemon thing. It just smelled of corporate greed, and of them not really caring about the game too much.
But assuming this screenshot is true, I'd say it's clear that development wasn't driven and pushed by corporate greed, but really was just a few guys trying their best.
As a craftopia player palworld definitely feels like a bit of a reskin, but one that gives players a lot of what they wanted (mainly being able to explore freely in multiplayer mode which is severely limited in craftopia).
One element palworld leaves out is being able to create your own automated processes (like automating a farm with a series of conveyor belts, chests, and various machines). They say they're still planning to develop craftopia so I am pretty excited to get the elegance of the palworld pets (which craftopia had too, but not as shiny) and the fun of automating your own homestead instead of setting up prefab stations.
Next big game: Digimon, but with guns
They already have guns, looking at you Gundramon
They didn't already do that? 🤔 Pretty sure I remember a digimon that was like a little cowboy with revolvers.
I don't have a creative vision. I just want to make a game that people like
Unironically a good take.
This is probably an epic troll by the devs.
What does VCS stand for in this case?
Version Control System
At this point: That should be a synonym for "git" unless you REALLY have a good reason not to.
You underestimate how many companies with legacy code still run Subversion. Help.
Even legacy codebases get migrated easily. SVN etc. belongs in a museum. Best red flag for a dead-end dev job.
I got stuck organizing the migration at multiple companies. Believe me, I know how many people still use SVN (and CVS...)
And, in some cases, that is your "good reason not to" because of the disruption to development and needing to retrain devs. But it is also a migration that is well worth doing, if only because of how good Gitlab is.
Or Perforce, in gamedev circles
I much prefer Plastic to Perforce, from a brief experience I had with it on one project. Too bad it's been a victim of Unity's "buy it, paywall it" strategy, where getting a license is mostly unaffordable for smaller teams.
You don't typically use git for VC on game files; git sucks a lot for binary files as it tries to diff them. You can use it for your codebase though.
You usually use SVN for game files like models, sounds, etc etc.
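Git LFS is one common middle ground for teams that want to keep everything in git: large binaries are stored as small pointer files, and the tracking rules are plain lines in `.gitattributes`. A sketch (rules written by hand here so it doesn't require git-lfs to be installed; `git lfs track "*.png"` would append the same lines):

```shell
cd "$(mktemp -d)"
cat > .gitattributes <<'EOF'
*.png filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
EOF
# .gitattributes is committed like any other file, so the whole
# team picks up the same tracking rules on pull
cat .gitattributes
```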
Version Control System - something like Git, Subversion, Unity Teams' VCS, etc.
Is this a dev? They're saying a lot of "they" and not "we".
They're translating from an interview that was being given in Japanese.
That's pretty telling of the state of the industry nowadays.
How did this game even get discovered?
It's an indie. Indies just piece stuff together based on the experience of their devs.