Being able to track a branch seems like a very good feature. Git submodules can only track commit IDs, which isn't practical. This could replace git submodules and improve on them!
I’m imagining some sort of art project where people try to figure out the most complicated software you can make entirely out of files stitched together from different repos.
Or a FUSE filesystem based on it.
Haha! That is a pretty creative idea.
I was just thinking about this topic this morning, so this is quite a timely post! I received a PR with binary test data files that were copied from an upstream ecosystem. Similarly, I had copied files from that repo, but who should trust me?
Submodules seemed too heavyweight, so I was contemplating a CI script that compared selected checksums against a clone. This seems like a great general solution, so I'll take a look at dropping it in!
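(For reference, the check I was contemplating was roughly the following; the upstream URL and file list are placeholders:)

    # clone the upstream shallowly, then compare checksums of the copied files
    git clone --depth 1 https://github.com/upstream/project.git /tmp/upstream
    for f in tests/data/a.bin tests/data/b.bin; do
      a=$(sha256sum < "/tmp/upstream/$f")
      b=$(sha256sum < "$f")
      [ "$a" = "$b" ] || { echo "checksum mismatch: $f"; exit 1; }
    done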
Thank you!
Hi HN, thanks for sending this to the front page.
I'm finding myself needing some resources from other projects in a way that ecosystem-specific dependency management isn't going to help with, or where I'd be pulling in too many files.
Submodules aren't the answer, and some other existing git user-defined commands don't seem to do what I need either.
I want a file from another repository, the ability to pin it or track it going forward, or to just always be up to date by using the default HEAD commit value set in `.git-remote-files`.
For example, it lets me track the README file from the octocat/Hello-World repository and pull down the file; a record of it is then saved in `.git-remote-files`. There's a rough sketch of the flow below. Let me know if you have any questions!
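(Roughly like this, from memory; double-check the README for the exact subcommands and flags:)

    # track a file from another repository, then pull it down (syntax approximate)
    git fetch-file add https://github.com/octocat/Hello-World.git README
    git fetch-file pull

which would record something along these lines (illustrative; the real format may differ):

    [file "README"]
      repository = https://github.com/octocat/Hello-World.git
      commit = a1b2c3d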
been wanting to build something very similar, so sharing some notes (before actually getting to test it):
--dry-run
a "push" subcommand? (especially in combination with 'overwrite to local repository-path" mentioned below, for remotes rater useless sure ;))
also, your README leaves it kinda open what happens with the actual file(s): `.git-remote-files` is mentioned as "should be committed", but what about the file it cloned?
also a little unclear how `--save` plays into that (since the `.git-remote-files` example shows only a commit, no branch), and when would one ever run it without `--save`?
a CLI arg for a secondary `.git-remote-files` file (possibly a `.local.git-remote-files` or such that can also override repository URLs) for local/private repos?
an option (autodetect?) to also write gitattributes entries marking those picked files binary (which could go into the repo, or local-only into the repo's `.git/info/attributes`...)
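(i.e. entries like this, in `.gitattributes` or local-only in `.git/info/attributes`:)

    # treat fetched data files as binary: no text diffs, no eol conversion
    tests/data/*.bin binary
    vendor/logo.png binary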
since it's called `git-fetch-file` and not `.git-remote-files`, an overall comment may be nice as a reference when first generating the file ;)
but by now I'm just rambling; looking forward to actually trying it when I'm home ;) thanks in advance
Thank you for your comments!
I'm thinking about what would be some nice output for `--dry-run`. Do you have a desired behavior? Maybe something like this?

    Would fetch:
      src/utils.js from https://github.com/user/library.git (a1b2c3d -> f9e8d7c)
      config/webpack.js from https://github.com/company/tools.git (HEAD -> 1a2b3c4)
    Would skip (local changes):
      docs/README.md from https://github.com/org/templates.git (use --force to overwrite)
    Up to date:
      package.json from https://github.com/user/library.git (f4e5d6c)
Push seems kinda neat for getting changes back to a remote! I will try to make the README.md a little clearer about what happens after `pull`, because you're right, it's not specified. Files aren't actually committed, just placed in the directory for you to do with as you please.
I like your ideas! Thank you!
for `--dry-run` that looks pretty good, yep! I like the `--force` hint in there too!
for the "push", I think my idea was mostly about "local-remotes", think "I have both cloned locally, with both IDEs open going back&forth"
one distinction there would be `../someupstream/file` vs. `../someupstream/.git/refs/HEAD:file`... aka "pick the file as is" (potentially marked as "${HEAD}-dirty") vs. "only committed things are truth" (if only so one doesn't need an extra `cp` command ;))
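(in plain-git terms, the two modes would be roughly:)

    # "as is": copy the worktree version, uncommitted changes included
    cp ../someupstream/path/to/file path/to/file
    # "committed only": take the file as it is at the upstream's HEAD
    git -C ../someupstream show HEAD:path/to/file > path/to/file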
"just placed in the directory for you to do as you please" could open up an "--auto-commit" option -> based on a template similar to the dry run? (ideally overridable in `.git-remote-files`, maybe like the sketch below)
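(key name and placeholders entirely made up, just to illustrate:)

    # hypothetical commit-message template, not a real key
    [defaults]
      commit-template = fetch-file: update ${path} from ${repository} (${old} -> ${new})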
Good ideas. You mentioning your locally cloned repo was what spurred this. I wanted to do something a little more robust than just copy and paste the file from the other repository, but I also didn't want to inject the entire project as a dependency for a single file.
It feels like it would be excessive to perform a remote call if we already have the repository locally checked out, so I'll think that one over. I would like to add that!
I will also think more about automatically committing after `pull`. Other standard git commands have `--no-commit` arguments, so this would be a bit different, since we're behaving a bit like `git fetch`.
Edit: Would you like me to add you to a Contributors section of the README.md? Thanks for your input!
I'm mostly thinking in a "tell fetch-file that github.com/mine/scripts.git is already cloned to ~/devstuff/scripts/" fashion, to then skip the clone and use my provided folder instead of cloning into CACHE_DIR
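(side note: if the clone into CACHE_DIR goes through plain git, git's own URL rewriting can already fake this; just an idea, and it assumes the tool doesn't resolve URLs itself:)

    # make git use the local clone wherever the GitHub URL appears
    git config --global url."$HOME/devstuff/scripts/".insteadOf \
      "https://github.com/mine/scripts.git"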
which shines when combined with a push subcommand when doing "active development" in the downstream repo, since it can shove the fix you made back into the upstream so you can commit and release it there
("push" may then also have a --with-commit option that 1) makes a commit in the local upstream 2) updates downstreams `.git-remote-files` commit-key)
on `--no-commit`, I don't know... while I very much like the idea of it being able to auto-commit, default-enabled might be a little overreaching, but I'm not 100% certain on this... maybe your `--commit <commit>` could become `--ref <ref>` instead to free up `--(no-)commit`? (ref might even be more correct git nomenclature ;))
readme: no thanks, just on a throwaway anyhow ;)
Yeah, I wasn't sure about the commit flag either! I opted to add the feature using a `--commit` flag with no arguments, and the default remaining to not commit, much like `fetch` itself.
Then, I made the automatic commit messages reflect a style that looked like git's standard automatic commit messages, with summaries for larger commits. Though upon reflection, maybe a list of file changes or updates would be more in line with git's style.
Maybe that's close enough.
Man this is exactly what I was looking for. Thank you!
I'm so glad this is helpful for you, too!