#!/usr/bin/env perl
# Copyright (C) 2006, Eric Wong <normalperson@yhbt.net>
# License: GPL v2 or later
use warnings;
use strict;
use vars qw/	$AUTHOR $VERSION
		$SVN_URL
		$GIT_SVN_INDEX $GIT_SVN
		$GIT_DIR $GIT_SVN_DIR $REVDB
		$_follow_parent $sha1 $sha1_short $_revision
		$_cp_remote $_upgrade $_rmdir $_q $_cp_similarity
		$_find_copies_harder $_l $_authors %users/;
$AUTHOR = 'Eric Wong <normalperson@yhbt.net>';
$VERSION = '@@GIT_VERSION@@';

$ENV{GIT_DIR} ||= '.git';
$Git::SVN::default_repo_id = $ENV{GIT_SVN_ID} || 'git-svn';

my $LC_ALL = $ENV{LC_ALL};
$Git::SVN::Log::TZ = $ENV{TZ};
# make sure the svn binary gives consistent output between locales and TZs:
$ENV{TZ} = 'UTC';
$ENV{LC_ALL} = 'C';
$| = 1; # unbuffer STDOUT

sub fatal (@) { print STDERR @_; exit 1 }
require SVN::Core; # use()-ing this causes segfaults for me... *shrug*
require SVN::Ra;
require SVN::Delta;
if ($SVN::Core::VERSION lt '1.1.0') {
	fatal "Need SVN::Core 1.1.0 or better (got $SVN::Core::VERSION)\n";
}
push @Git::SVN::Ra::ISA, 'SVN::Ra';
push @SVN::Git::Editor::ISA, 'SVN::Delta::Editor';
push @SVN::Git::Fetcher::ISA, 'SVN::Delta::Editor';
use Carp qw/croak/;
use IO::File qw//;
use File::Basename qw/dirname basename/;
use File::Path qw/mkpath/;
use Getopt::Long qw/:config gnu_getopt no_ignore_case auto_abbrev pass_through/;
use IPC::Open3;
use Git;

BEGIN {
	my $s;
	foreach (qw/command command_oneline command_noisy command_output_pipe
	            command_input_pipe command_close_pipe/) {
		$s .= "*SVN::Git::Editor::$_ = *SVN::Git::Fetcher::$_ = ".
		      "*Git::SVN::Migration::$_ = ".
		      "*Git::SVN::Log::$_ = *Git::SVN::$_ = *$_ = *Git::$_; ";
	}
	eval $s;
}

my ($SVN);

my $_optimize_commits = 1 unless $ENV{GIT_SVN_NO_OPTIMIZE_COMMITS};
$sha1 = qr/[a-f\d]{40}/;
$sha1_short = qr/[a-f\d]{4,40}/;
my ($_stdin, $_help, $_edit,
	$_repack, $_repack_nr, $_repack_flags,
	$_message, $_file, $_no_metadata,
	$_template, $_shared,
	$_version, $_upgrade,
	$_merge, $_strategy, $_dry_run,
	$_prefix);

my %remote_opts = ( 'username=s' => \$Git::SVN::Prompt::_username,
		'config-dir=s' => \$Git::SVN::Ra::config_dir,
		'no-auth-cache' => \$Git::SVN::Prompt::_no_auth_cache );
my %fc_opts = ( 'follow-parent|follow' => \$_follow_parent,
		'authors-file|A=s' => \$_authors,
		'repack:i' => \$_repack,
		'no-metadata' => \$_no_metadata,
		'quiet|q' => \$_q,
		'repack-flags|repack-args|repack-opts=s' => \$_repack_flags,
		%remote_opts );

my ($_trunk, $_tags, $_branches);
my %multi_opts = ( 'trunk|T=s' => \$_trunk,
		'tags|t=s' => \$_tags,
		'branches|b=s' => \$_branches );
my %init_opts = ( 'template=s' => \$_template, 'shared' => \$_shared );
my %cmt_opts = ( 'edit|e' => \$_edit,
		'rmdir' => \$_rmdir,
		'find-copies-harder' => \$_find_copies_harder,
		'l=i' => \$_l,
		'copy-similarity|C=i'=> \$_cp_similarity
);

my %cmd = (
	fetch => [ \&cmd_fetch, "Download new revisions from SVN",
			{ 'revision|r=s' => \$_revision, %fc_opts } ],
	init => [ \&cmd_init, "Initialize a repo for tracking" .
			" (requires URL argument)",
			\%init_opts ],
	dcommit => [ \&cmd_dcommit,
			'Commit several diffs to merge with upstream',
			{ 'merge|m|M' => \$_merge,
			  'strategy|s=s' => \$_strategy,
			  'dry-run|n' => \$_dry_run,
			  %cmt_opts, %fc_opts } ],
	'set-tree' => [ \&cmd_set_tree,
			"Set an SVN repository to a git tree-ish",
			{ 'stdin|' => \$_stdin, %cmt_opts, %fc_opts, } ],
	'show-ignore' => [ \&cmd_show_ignore, "Show svn:ignore listings",
			{ 'revision|r=i' => \$_revision } ],
	rebuild => [ \&cmd_rebuild, "Rebuild git-svn metadata (after git clone)",
			{ 'copy-remote|remote=s' => \$_cp_remote,
			  'upgrade' => \$_upgrade } ],
	'multi-init' => [ \&cmd_multi_init,
			'Initialize multiple trees (like git-svnimport)',
			{ %multi_opts, %init_opts, %remote_opts,
			  'revision|r=i' => \$_revision,
			  'prefix=s' => \$_prefix,
			} ],
	'multi-fetch' => [ \&cmd_multi_fetch,
			'Fetch multiple trees (like git-svnimport)',
			\%fc_opts ],
	'migrate' => [ sub { },
			# no-op, we automatically run this anyway,
			'Migrate configuration/metadata/layout from
			 previous versions of git-svn',
			\%remote_opts ],
	'log' => [ \&Git::SVN::Log::cmd_show_log, 'Show commit logs',
			{ 'limit=i' => \$Git::SVN::Log::limit,
			  'revision|r=s' => \$_revision,
			  'verbose|v' => \$Git::SVN::Log::verbose,
			  'incremental' => \$Git::SVN::Log::incremental,
			  'oneline' => \$Git::SVN::Log::oneline,
			  'show-commit' => \$Git::SVN::Log::show_commit,
			  'non-recursive' => \$Git::SVN::Log::non_recursive,
			  'authors-file|A=s' => \$_authors,
			  'color' => \$Git::SVN::Log::color,
			  'pager=s' => \$Git::SVN::Log::pager,
			} ],
	'commit-diff' => [ \&cmd_commit_diff,
			'Commit a diff between two trees',
			{ 'message|m=s' => \$_message,
			  'file|F=s' => \$_file,
			  'revision|r=s' => \$_revision,
			  %cmt_opts } ],
);

my $cmd;
for (my $i = 0; $i < @ARGV; $i++) {
	if (defined $cmd{$ARGV[$i]}) {
		$cmd = $ARGV[$i];
		splice @ARGV, $i, 1;
		last;
	}
};

my %opts = %{$cmd{$cmd}->[2]} if (defined $cmd);

read_repo_config(\%opts);
my $rv = GetOptions(%opts, 'help|H|h' => \$_help,
		'version|V' => \$_version,
		'minimize-connections' => \$Git::SVN::Migration::_minimize,
		'id|i=s' => \$Git::SVN::default_repo_id);
exit 1 if (!$rv && $cmd ne 'log');

usage(0) if $_help;
version() if $_version;
usage(1) unless defined $cmd;
load_authors() if $_authors;
unless ($cmd =~ /^(?:init|rebuild|multi-init|commit-diff)$/) {
	Git::SVN::Migration::migration_check();
}
$cmd{$cmd}->[0]->(@ARGV);
exit 0;

####################### primary functions ######################

sub usage {
	my $exit = shift || 0;
	my $fd = $exit ? \*STDERR : \*STDOUT;
	print $fd <<"";
git-svn - bidirectional operations between a single Subversion tree and git
Usage: $0 <command> [options] [arguments]\n

	print $fd "Available commands:\n" unless $cmd;

	foreach (sort keys %cmd) {
		next if $cmd && $cmd ne $_;
		print $fd ' ',pack('A17',$_),$cmd{$_}->[1],"\n";
		foreach (keys %{$cmd{$_}->[2]}) {
			# prints out arguments as they should be passed:
			my $x = s#[:=]s$## ? '<arg>' : s#[:=]i$## ? '<num>' : '';
			print $fd ' ' x 21, join(', ', map { length $_ > 1 ?
							"--$_" : "-$_" }
						split /\|/,$_)," $x\n";
		}
	}
	print $fd <<"";
\nGIT_SVN_ID may be set in the environment or via the --id/-i switch to an
arbitrary identifier if you're tracking multiple SVN branches/repositories in
one git repository and want to keep them separate. See git-svn(1) for more
information.

	exit $exit;
}

sub version {
	print "git-svn version $VERSION (svn $SVN::Core::VERSION)\n";
	exit 0;
}

sub cmd_rebuild {
	my $url = shift;
	my $gs = $url ? Git::SVN->init($url)
	              : eval { Git::SVN->new };
	$gs ||= Git::SVN->_new;
	if (!verify_ref($gs->refname.'^0')) {
		$gs->copy_remote_ref;
	}

	my ($rev_list, $ctx) = command_output_pipe("rev-list", $gs->refname);
	my $latest;
	my $svn_uuid;
	while (<$rev_list>) {
		chomp;
		my $c = $_;
		fatal "Non-SHA1: $c\n" unless $c =~ /^$sha1$/o;
		my ($url, $rev, $uuid) = cmt_metadata($c);

		# ignore merges (from set-tree)
		next if (!defined $rev || !$uuid);

		# if we merged or otherwise started elsewhere, this is
		# how we break out of it
		if ((defined $svn_uuid && ($uuid ne $svn_uuid)) ||
		    ($gs->{url} && $url && ($url ne $gs->{url}))) {
			next;
		}

		unless (defined $latest) {
			if (!$gs->{url} && !$url) {
				fatal "SVN repository location required\n";
			}
			$gs = Git::SVN->init($url);
			$latest = $rev;
		}
		$gs->rev_db_set($rev, $c);
		print "r$rev = $c\n";
	}
	command_close_pipe($rev_list, $ctx);
}

sub do_git_init_db {
	unless (-d $ENV{GIT_DIR}) {
		my @init_db = ('init');
		push @init_db, "--template=$_template" if defined $_template;
		push @init_db, "--shared" if defined $_shared;
		command_noisy(@init_db);
	}
}

sub cmd_init {
	my $url = shift or die "SVN repository location required " .
	                       "as a command-line argument\n";
	if (my $repo_path = shift) {
		unless (-d $repo_path) {
			mkpath([$repo_path]);
		}
		chdir $repo_path or croak $!;
		$ENV{GIT_DIR} = $repo_path . "/.git";
	}
	do_git_init_db();

	Git::SVN->init($url);
}

sub cmd_fetch {
	my $gs = Git::SVN->new;
	$gs->fetch(@_);
	if ($gs->{last_commit} && !verify_ref('refs/heads/master^0')) {
		command_noisy(qw(update-ref refs/heads/master),
		              $gs->{last_commit});
	}
}

sub cmd_set_tree {
	my (@commits) = @_;
	if ($_stdin || !@commits) {
		print "Reading from stdin...\n";
		@commits = ();
		while (<STDIN>) {
			if (/\b($sha1_short)\b/o) {
				unshift @commits, $1;
			}
		}
	}
	my @revs;
	foreach my $c (@commits) {
		my @tmp = command('rev-parse',$c);
		if (scalar @tmp == 1) {
			push @revs, $tmp[0];
		} elsif (scalar @tmp > 1) {
			push @revs, reverse(command('rev-list',@tmp));
		} else {
			fatal "Failed to rev-parse $c\n";
		}
	}
	my $gs = Git::SVN->new;
	my ($r_last, $cmt_last) = $gs->last_rev_commit;
	$gs->fetch;
	if ($r_last != $gs->{last_rev}) {
		fatal "There are new revisions that were fetched ",
		      "and need to be merged (or acknowledged) ",
		      "before committing.\nlast rev: $r_last\n",
		      " current: $gs->{last_rev}\n";
	}
	$gs->set_tree($_) foreach @revs;
	print "Done committing ",scalar @revs," revisions to SVN\n";
}

sub cmd_dcommit {
	my $head = shift;
	my $gs = Git::SVN->new;
	$head ||= 'HEAD';
	my @refs = command(qw/rev-list --no-merges/, $gs->refname."..$head");
	my $last_rev;
	foreach my $d (reverse @refs) {
		if (!verify_ref("$d~1")) {
			fatal "Commit $d\n",
			      "has no parent commit, and therefore ",
			      "nothing to diff against.\n",
			      "You should be working from a repository ",
			      "originally created by git-svn\n";
		}
		unless (defined $last_rev) {
			(undef, $last_rev, undef) = cmt_metadata("$d~1");
			unless (defined $last_rev) {
				fatal "Unable to extract revision information ",
				      "from commit $d~1\n";
			}
		}
		if ($_dry_run) {
			print "diff-tree $d~1 $d\n";
		} else {
			my $log = get_commit_entry($d)->{log};
			my $ra = $gs->ra;
			my $pool = SVN::Pool->new;
			my %ed_opts = ( r => $last_rev,
			                ra => $ra->dup,
			                svn_path => $ra->{svn_path} );
			my $ed = SVN::Git::Editor->new(\%ed_opts,
			                 $ra->get_commit_editor($log,
			                 sub { print "Committed r$_[0]\n";
			                       $last_rev = $_[0]; }),
			                 $pool);
			my $mods = $ed->apply_diff("$d~1", $d);
			if (@$mods == 0) {
				print "No changes\n$d~1 == $d\n";
			}
		}
	}
	return if $_dry_run;
	$gs->fetch;
	# we always want to rebase against the current HEAD, not any
	# head that was passed to us
	my @diff = command('diff-tree', 'HEAD', $gs->refname, '--');
	my @finish;
	if (@diff) {
		@finish = qw/rebase/;
		push @finish, qw/--merge/ if $_merge;
		push @finish, "--strategy=$_strategy" if $_strategy;
		print STDERR "W: HEAD and ", $gs->refname, " differ, ",
		             "using @finish:\n", "@diff";
	} else {
		print "No changes between current HEAD and ",
		      $gs->refname, "\nResetting to the latest ",
		      $gs->refname, "\n";
		@finish = qw/reset --mixed/;
	}
	command_noisy(@finish, $gs->refname);
}

sub cmd_show_ignore {
	my $gs = Git::SVN->new;
	my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
	$gs->traverse_ignore(\*STDOUT, '', $r);
}
2007-01-12 02:35:55 +03:00
|
|
|
sub cmd_multi_init {
|
2006-06-13 02:53:13 +04:00
|
|
|
my $url = shift;
|
2007-01-05 05:02:00 +03:00
|
|
|
unless (defined $_trunk || defined $_branches || defined $_tags) {
|
|
|
|
usage(1);
|
2006-06-13 02:53:13 +04:00
|
|
|
}
|
2007-01-12 02:35:55 +03:00
|
|
|
do_git_init_db();
|
|
|
|
$_prefix = '' unless defined $_prefix;
|
2007-01-19 04:50:01 +03:00
|
|
|
$url =~ s#/+$## if defined $url;
|
2007-01-05 05:02:00 +03:00
|
|
|
if (defined $_trunk) {
|
2007-01-19 04:50:01 +03:00
|
|
|
my $trunk_ref = $_prefix . 'trunk';
|
|
|
|
# try both old-style and new-style lookups:
|
|
|
|
my $gs_trunk = eval { Git::SVN->new($trunk_ref) };
|
2007-01-12 02:35:55 +03:00
|
|
|
unless ($gs_trunk) {
|
2007-01-19 04:50:01 +03:00
|
|
|
my ($trunk_url, $trunk_path) =
|
|
|
|
complete_svn_url($url, $_trunk);
|
|
|
|
$gs_trunk = Git::SVN->init($trunk_url, $trunk_path,
|
|
|
|
undef, $trunk_ref);
|
2007-01-05 05:02:00 +03:00
|
|
|
}
|
2006-10-11 22:53:21 +04:00
|
|
|
}
|
2007-01-19 04:50:01 +03:00
|
|
|
return unless defined $_branches || defined $_tags;
|
2007-01-12 04:09:26 +03:00
|
|
|
my $ra = $url ? Git::SVN::Ra->new($url) : undef;
|
|
|
|
complete_url_ls_init($ra, $_branches, '--branches/-b', $_prefix);
|
|
|
|
complete_url_ls_init($ra, $_tags, '--tags/-t', $_prefix . 'tags/');
|
2006-06-13 02:53:13 +04:00
|
|
|
}

sub cmd_multi_fetch {
	my @gs;
	foreach (command(qw/config -l/)) {
		next unless m!^svn-remote\.(.+)\.fetch=
		              \s*(.*)\s*:\s*refs/remotes/(.+)\s*$!x;
		my ($repo_id, $path, $ref_id) = ($1, $2, $3);
		push @gs, Git::SVN->new($ref_id, $repo_id, $path);
	}
	foreach (@gs) {
		$_->fetch;
	}
}

# this command is special because it requires no metadata
sub cmd_commit_diff {
	my ($ta, $tb, $url) = @_;
	my $usage = "Usage: $0 commit-diff -r<revision> ".
	            "<tree-ish> <tree-ish> [<URL>]\n";
	fatal($usage) if (!defined $ta || !defined $tb);
	if (!defined $url) {
		my $gs = eval { Git::SVN->new };
		if (!$gs) {
			fatal("Needed URL or usable git-svn --id in ",
			      "the command-line\n", $usage);
		}
		$url = $gs->{url};
	}
	unless (defined $_revision) {
		fatal("-r|--revision is a required argument\n", $usage);
	}
	if (defined $_message && defined $_file) {
		fatal("Both --message/-m and --file/-F specified ",
		      "for the commit message.\n",
		      "I have no idea what you mean\n");
	}
	if (defined $_file) {
		$_message = file_to_s($_file);
	} else {
		$_message ||= get_commit_entry($tb)->{log};
	}
	my $ra = Git::SVN::Ra->new($url);
	my $r = $_revision;
	if ($r eq 'HEAD') {
		$r = $ra->get_latest_revnum;
	} elsif ($r !~ /^\d+$/) {
		die "revision argument: $r not understood by git-svn\n";
	}
	my $pool = SVN::Pool->new;
	my %ed_opts = ( r => $r,
	                ra => $ra->dup,
	                svn_path => $ra->{svn_path} );
	my $ed = SVN::Git::Editor->new(\%ed_opts,
	                       $ra->get_commit_editor($_message,
	                           sub { print "Committed r$_[0]\n" }),
	                       $pool);
	my $mods = $ed->apply_diff($ta, $tb);
	if (@$mods == 0) {
		print "No changes\n$ta == $tb\n";
	}
	$pool->clear;
}

########################### utility functions #########################

sub complete_svn_url {
	my ($url, $path) = @_;
	$path =~ s#/+$##;
	if ($path !~ m#^[a-z\+]+://#) {
		if (!defined $url || $url !~ m#^[a-z\+]+://#) {
			fatal("E: '$path' is not a complete URL ",
			      "and a separate URL is not specified\n");
		}
		return ($url, $path);
	}
	return ($path, '');
}

sub complete_url_ls_init {
	my ($ra, $repo_path, $switch, $pfx) = @_;
	unless ($repo_path) {
		print STDERR "W: $switch not specified\n";
		return;
	}
	$repo_path =~ s#/+$##;
	if ($repo_path =~ m#^[a-z\+]+://#) {
		$ra = Git::SVN::Ra->new($repo_path);
		$repo_path = '';
	} else {
		$repo_path =~ s#^/+##;
		unless ($ra) {
			fatal("E: '$repo_path' is not a complete URL ",
			      "and a separate URL is not specified\n");
		}
	}
	my $r = defined $_revision ? $_revision : $ra->get_latest_revnum;
	my ($dirent, undef, undef) = $ra->get_dir($repo_path, $r);
	my $url = $ra->{url};
	foreach my $d (sort keys %$dirent) {
		next if ($dirent->{$d}->kind != $SVN::Node::dir);
		my $path = "$repo_path/$d";
		my $ref = "$pfx$d";
		my $gs = eval { Git::SVN->new($ref) };
		# don't try to init already existing refs
		unless ($gs) {
			print "init $url/$path => $ref\n";
			Git::SVN->init($url, $path, undef, $ref);
		}
	}
}

sub verify_ref {
	my ($ref) = @_;
	eval { command_oneline([ 'rev-parse', '--verify', $ref ],
	                       { STDERR => 0 }); };
}

sub get_tree_from_treeish {
	my ($treeish) = @_;
	# $treeish can be a symbolic ref, too:
	my $type = command_oneline(qw/cat-file -t/, $treeish);
	my $expected;
	while ($type eq 'tag') {
		($treeish, $type) = command(qw/cat-file tag/, $treeish);
	}
	if ($type eq 'commit') {
		$expected = (grep /^tree /, command(qw/cat-file commit/,
		                                    $treeish))[0];
		($expected) = ($expected =~ /^tree ($sha1)$/o);
		die "Unable to get tree from $treeish\n" unless $expected;
	} elsif ($type eq 'tree') {
		$expected = $treeish;
	} else {
		die "$treeish is a $type, expected tree, tag or commit\n";
	}
	return $expected;
}

sub get_commit_entry {
	my ($treeish) = shift;
	my %log_entry = ( log => '', tree => get_tree_from_treeish($treeish) );
	my $commit_editmsg = "$ENV{GIT_DIR}/COMMIT_EDITMSG";
	my $commit_msg = "$ENV{GIT_DIR}/COMMIT_MSG";
	open my $log_fh, '>', $commit_editmsg or croak $!;

	my $type = command_oneline(qw/cat-file -t/, $treeish);
	if ($type eq 'commit' || $type eq 'tag') {
		my ($msg_fh, $ctx) = command_output_pipe('cat-file',
		                                         $type, $treeish);
		my $in_msg = 0;
		while (<$msg_fh>) {
			if (!$in_msg) {
				$in_msg = 1 if (/^\s*$/);
			} elsif (/^git-svn-id: /) {
				# skip this for now, we regenerate the
				# correct one on re-fetch anyways
				# TODO: set *:merge properties or like...
			} else {
				print $log_fh $_ or croak $!;
			}
		}
		command_close_pipe($msg_fh, $ctx);
	}
	close $log_fh or croak $!;

	if ($_edit || ($type eq 'tree')) {
		my $editor = $ENV{VISUAL} || $ENV{EDITOR} || 'vi';
		# TODO: strip out spaces, comments, like git-commit.sh
		system($editor, $commit_editmsg);
	}
	rename $commit_editmsg, $commit_msg or croak $!;
	open $log_fh, '<', $commit_msg or croak $!;
	{ local $/; chomp($log_entry{log} = <$log_fh>); }
	close $log_fh or croak $!;
	unlink $commit_msg;
	\%log_entry;
}

sub s_to_file {
	my ($str, $file, $mode) = @_;
	open my $fd, '>', $file or croak $!;
	print $fd $str, "\n" or croak $!;
	close $fd or croak $!;
	chmod ($mode &~ umask, $file) if (defined $mode);
}

sub file_to_s {
	my $file = shift;
	open my $fd, '<', $file or croak "$!: file: $file\n";
	local $/;
	my $ret = <$fd>;
	close $fd or croak $!;
	$ret =~ s/\s*$//s;
	return $ret;
}

# '<svn username> = real-name <email address>' mapping based on git-svnimport:
sub load_authors {
	open my $authors, '<', $_authors or die "Can't open $_authors $!\n";
	my $log = $cmd eq 'log';
	while (<$authors>) {
		chomp;
		next unless /^(\S+?|\(no author\))\s*=\s*(.+?)\s*<(.+)>\s*$/;
		my ($user, $name, $email) = ($1, $2, $3);
		if ($log) {
			$Git::SVN::Log::rusers{"$name <$email>"} = $user;
		} else {
			$users{$user} = [$name, $email];
		}
	}
	close $authors or croak $!;
}

# convert GetOpt::Long specs for use by git-config
sub read_repo_config {
	return unless -d $ENV{GIT_DIR};
	my $opts = shift;
	foreach my $o (keys %$opts) {
		my $v = $opts->{$o};
		my ($key) = ($o =~ /^([a-z\-]+)/);
		$key =~ s/-//g;
		my $arg = 'git-config';
		$arg .= ' --int' if ($o =~ /[:=]i$/);
		$arg .= ' --bool' if ($o !~ /[:=][sfi]$/);
		if (ref $v eq 'ARRAY') {
			chomp(my @tmp = `$arg --get-all svn.$key`);
			@$v = @tmp if @tmp;
		} else {
			chomp(my $tmp = `$arg --get svn.$key`);
			if ($tmp && !($arg =~ / --bool/ && $tmp eq 'false')) {
				$$v = $tmp;
			}
		}
	}
}

sub extract_metadata {
	my $id = shift or return (undef, undef, undef);
	my ($url, $rev, $uuid) = ($id =~ /^git-svn-id:\s(\S+?)\@(\d+)
	                                  \s([a-f\d\-]+)$/x);
	if (!defined $rev || !$uuid || !$url) {
		# some of the original repositories I made had
		# identifiers like this:
		($rev, $uuid) = ($id =~ /^git-svn-id:\s(\d+)\@([a-f\d\-]+)/);
	}
	return ($url, $rev, $uuid);
}

sub cmt_metadata {
	return extract_metadata((grep(/^git-svn-id: /,
	        command(qw/cat-file commit/, shift)))[-1]);
}

sub get_commit_time {
	my $cmt = shift;
	my $fh = command_output_pipe(qw/rev-list --pretty=raw -n1/, $cmt);
	while (<$fh>) {
		/^committer\s(?:.+) (\d+) ([\-\+]?\d+)$/ or next;
		my ($s, $tz) = ($1, $2);
		if ($tz =~ s/^\+//) {
			$s += tz_to_s_offset($tz);
		} elsif ($tz =~ s/^\-//) {
			$s -= tz_to_s_offset($tz);
		}
		close $fh;
		return $s;
	}
	die "Can't get commit time for commit: $cmt\n";
}

# convert an unsigned "HHMM" timezone offset to seconds;
# the caller applies the +/- sign
sub tz_to_s_offset {
	my ($tz) = @_;
	$tz =~ s/(\d\d)$//;
	return ($1 * 60) + ($tz * 3600);
}

package Git::SVN;
use strict;
use warnings;
use vars qw/$default_repo_id/;
use Carp qw/croak/;
use File::Path qw/mkpath/;
use IPC::Open3;

# properties that we do not log:
my %SKIP_PROP;
BEGIN {
	%SKIP_PROP = map { $_ => 1 } qw/svn:wc:ra_dav:version-url
	                                svn:special svn:executable
	                                svn:entry:committed-rev
	                                svn:entry:last-author
	                                svn:entry:uuid
	                                svn:entry:committed-date/;
}

sub read_all_remotes {
	my $r = {};
	foreach (grep { s/^svn-remote\.// } command(qw/repo-config -l/)) {
		if (m!^(.+)\.fetch=\s*(.*)\s*:\s*refs/remotes/(.+)\s*$!) {
			$r->{$1}->{fetch}->{$2} = $3;
		} elsif (m!^(.+)\.url=\s*(.*)\s*$!) {
			$r->{$1}->{url} = $2;
		}
	}
	$r;
}

# we allow more chars than remotes2config.sh...
sub sanitize_remote_name {
	my ($name) = @_;
	$name =~ tr{A-Za-z0-9:,/+-}{.}c;
	$name;
}

sub init {
	my ($class, $url, $path, $repo_id, $ref_id) = @_;
	my $self = _new($class, $repo_id, $ref_id, $path);
	mkpath([$self->{dir}]);
	if (defined $url) {
		$url =~ s!/+$!!; # strip trailing slash
		my $orig_url = eval {
			command_oneline('config', '--get',
			                "svn-remote.$repo_id.url")
		};
		if ($orig_url) {
			if ($orig_url ne $url) {
				die "svn-remote.$repo_id.url already set: ",
				    "$orig_url\nwanted to set to: $url\n";
			}
		} else {
			command_noisy('config',
			              "svn-remote.$repo_id.url", $url);
		}
		command_noisy('config', '--add',
		              "svn-remote.$repo_id.fetch",
		              "$path:".$self->refname);
	}
	$self->{url} = $url;
	unless (-f $self->{db_path}) {
		open my $fh, '>>', $self->{db_path} or croak $!;
		close $fh or croak $!;
	}
	$self;
}

sub find_ref {
	my ($ref_id) = @_;
	foreach (command(qw/config -l/)) {
		next unless m!^svn-remote\.(.+)\.fetch=
		              \s*(.*)\s*:\s*refs/remotes/(.+)\s*$!x;
		my ($repo_id, $path, $ref) = ($1, $2, $3);
		if ($ref eq $ref_id) {
			$path = '' if ($path =~ m#^\./?#);
			return ($repo_id, $path);
		}
	}
	(undef, undef, undef);
}

sub new {
	my ($class, $ref_id, $repo_id, $path) = @_;
	if (defined $ref_id && !defined $repo_id && !defined $path) {
		($repo_id, $path) = find_ref($ref_id);
		if (!defined $repo_id) {
			die "Could not find a \"svn-remote.*.fetch\" key ",
			    "in the repository configuration matching: ",
			    "refs/remotes/$ref_id\n";
		}
	}
	my $self = _new($class, $repo_id, $ref_id, $path);
	$self->{url} = command_oneline('config', '--get',
	                               "svn-remote.$repo_id.url") or
	    die "Failed to read \"svn-remote.$repo_id.url\" in config\n";
	$self;
}

sub refname { "refs/remotes/$_[0]->{ref_id}" }

sub ra {
	my ($self) = shift;
	$self->{ra} ||= Git::SVN::Ra->new($self->{url});
}

sub rel_path {
	my ($self) = @_;
	my $repos_root = $self->ra->{repos_root};
	return $self->{path} if ($self->{url} eq $repos_root);
	my $url = $self->{url} .
	          (length $self->{path} ? "/$self->{path}" : $self->{path});
	$url =~ s!^\Q$repos_root\E/*!!g;
	$url;
}

sub copy_remote_ref {
	my ($self) = @_;
	my $origin = $::_cp_remote ? $::_cp_remote : 'origin';
	my $ref = $self->refname;
	if (command('ls-remote', $origin, $ref)) {
		command_noisy('fetch', $origin, "$ref:$ref");
	} elsif ($::_cp_remote && !$::_upgrade) {
		die "Unable to find remote reference: $ref on $origin\n";
	}
}

sub traverse_ignore {
	my ($self, $fh, $path, $r) = @_;
	$path =~ s#^/+##g;
	my ($dirent, undef, $props) = $self->ra->get_dir($path, $r);
	my $p = $path;
	$p =~ s#^\Q$self->{ra}->{svn_path}\E/##;
	print $fh length $p ? "\n# $p\n" : "\n# /\n";
	if (my $s = $props->{'svn:ignore'}) {
		$s =~ s/[\r\n]+/\n/g;
		chomp $s;
		if (length $p == 0) {
			$s =~ s#\n#\n/$p#g;
			print $fh "/$s\n";
		} else {
			$s =~ s#\n#\n/$p/#g;
			print $fh "/$p/$s\n";
		}
	}
	foreach (sort keys %$dirent) {
		next if $dirent->{$_}->kind != $SVN::Node::dir;
		$self->traverse_ignore($fh, "$path/$_", $r);
	}
}

# returns the newest SVN revision number and newest commit SHA1
sub last_rev_commit {
	my ($self) = @_;
	if (defined $self->{last_rev} && defined $self->{last_commit}) {
		return ($self->{last_rev}, $self->{last_commit});
	}
	my $c = ::verify_ref($self->refname.'^0');
	if ($c) {
		my $rev = (::cmt_metadata($c))[1];
		if (defined $rev) {
			($self->{last_rev}, $self->{last_commit}) = ($rev, $c);
			return ($rev, $c);
		}
	}
	# each rev_db record is a 40-byte sha1 plus "\n" (41 bytes total),
	# one record per SVN revision, so we scan backwards from the tail:
	my $offset = -41; # from tail
	my $rl;
	open my $fh, '<', $self->{db_path} or
	    croak "$self->{db_path} not readable: $!\n";
	seek $fh, $offset, 2;
	$rl = readline $fh;
	defined $rl or return (undef, undef);
	chomp $rl;
	while ($c ne $rl && tell $fh != 0) {
		$offset -= 41;
		seek $fh, $offset, 2;
		$rl = readline $fh;
		defined $rl or return (undef, undef);
		chomp $rl;
	}
	my $rev = tell $fh;
	croak $! if ($rev < 0);
	$rev = ($rev - 41) / 41;
	close $fh or croak $!;
	($self->{last_rev}, $self->{last_commit}) = ($rev, $c);
	return ($rev, $c);
}

sub parse_revision {
	my ($self, $base) = @_;
	my $head = $self->ra->get_latest_revnum;
	if (!defined $::_revision || $::_revision eq 'BASE:HEAD') {
		return ($base + 1, $head) if (defined $base);
		return (0, $head);
	}
	return ($1, $2) if ($::_revision =~ /^(\d+):(\d+)$/);
	return ($::_revision, $::_revision) if ($::_revision =~ /^\d+$/);
	if ($::_revision =~ /^BASE:(\d+)$/) {
		return ($base + 1, $1) if (defined $base);
		return (0, $head);
	}
	return ($1, $head) if ($::_revision =~ /^(\d+):HEAD$/);
	die "revision argument: $::_revision not understood by git-svn\n",
	    "Try using the command-line svn client instead\n";
}

sub tmp_index_do {
	my ($self, $sub) = @_;
	my $old_index = $ENV{GIT_INDEX_FILE};
	$ENV{GIT_INDEX_FILE} = $self->{index};
	my @ret = &$sub;
	if ($old_index) {
		$ENV{GIT_INDEX_FILE} = $old_index;
	} else {
		delete $ENV{GIT_INDEX_FILE};
	}
	wantarray ? @ret : $ret[0];
}

sub assert_index_clean {
	my ($self, $treeish) = @_;

	$self->tmp_index_do(sub {
		command_noisy('read-tree', $treeish) unless -e $self->{index};
		my $x = command_oneline('write-tree');
		my ($y) = (command(qw/cat-file commit/, $treeish) =~
		           /^tree ($::sha1)/mo);
		if ($y ne $x) {
			unlink $self->{index} or croak $!;
			command_noisy('read-tree', $treeish);
		}
		$x = command_oneline('write-tree');
		if ($y ne $x) {
			::fatal "trees ($treeish) $y != $x\n",
			        "Something is seriously wrong...\n";
		}
	});
}

sub get_commit_parents {
	my ($self, $log_entry, @parents) = @_;
	my (%seen, @ret, @tmp);
	# commit parents can be conditionally bound to a particular
	# svn revision via: "svn_revno=commit_sha1", filter them out here:
	foreach my $p (@parents) {
		next unless defined $p;
		if ($p =~ /^(\d+)=($::sha1_short)$/o) {
			push @tmp, $2 if $1 == $log_entry->{revision};
		} else {
			push @tmp, $p if $p =~ /^$::sha1_short$/o;
		}
	}
	if (my $cur = ::verify_ref($self->refname.'^0')) {
		push @tmp, $cur;
	}
	push @tmp, $_ foreach (@{$log_entry->{parents}}, @tmp);
	while (my $p = shift @tmp) {
		next if $seen{$p};
		$seen{$p} = 1;
		push @ret, $p;
		# MAXPARENT is defined to 16 in commit-tree.c:
		last if @ret >= 16;
	}
	if (@tmp) {
		die "r$log_entry->{revision}: No room for parents:\n\t",
		    join("\n\t", @tmp), "\n";
	}
	@ret;
}

sub full_url {
	my ($self) = @_;
	$self->ra->{url} . (length $self->{path} ? '/' . $self->{path} : '');
}

sub do_git_commit {
	my ($self, $log_entry, @parents) = @_;
	if (my $c = $self->rev_db_get($log_entry->{revision})) {
		croak "$log_entry->{revision} = $c already exists! ",
		      "Why are we refetching it?\n";
	}
	my $author = $log_entry->{author};
	my ($name, $email) = (defined $::users{$author} ? @{$::users{$author}}
	                      : ($author, "$author\@".$self->ra->uuid));
	$ENV{GIT_AUTHOR_NAME} = $ENV{GIT_COMMITTER_NAME} = $name;
	$ENV{GIT_AUTHOR_EMAIL} = $ENV{GIT_COMMITTER_EMAIL} = $email;
	$ENV{GIT_AUTHOR_DATE} = $ENV{GIT_COMMITTER_DATE} = $log_entry->{date};

	my $tree = $log_entry->{tree};
	if (!defined $tree) {
		$tree = $self->tmp_index_do(sub {
		                            command_oneline('write-tree') });
	}
	die "Tree is not a valid sha1: $tree\n" if $tree !~ /^$::sha1$/o;

	my @exec = ('git-commit-tree', $tree);
	foreach ($self->get_commit_parents($log_entry, @parents)) {
		push @exec, '-p', $_;
	}
	defined(my $pid = open3(my $msg_fh, my $out_fh, '>&STDERR', @exec))
	    or croak $!;
	print $msg_fh $log_entry->{log} or croak $!;
	print $msg_fh "\ngit-svn-id: ", $self->full_url, '@',
	              $log_entry->{revision}, ' ',
	              $self->ra->uuid, "\n" or croak $!;
	$msg_fh->flush == 0 or croak $!;
	close $msg_fh or croak $!;
	chomp(my $commit = do { local $/; <$out_fh> });
	close $out_fh or croak $!;
	waitpid $pid, 0;
	croak $? if $?;
	if ($commit !~ /^$::sha1$/o) {
		die "Failed to commit, invalid sha1: $commit\n";
	}

	command_noisy('update-ref', $self->refname, $commit);
	$self->rev_db_set($log_entry->{revision}, $commit);

	$self->{last_rev} = $log_entry->{revision};
	$self->{last_commit} = $commit;
	print "r$log_entry->{revision} = $commit\n";
	return $commit;
}

sub revisions_eq {
	my ($self, $r0, $r1) = @_;
	return 1 if $r0 == $r1;
	my $nr = 0;
	$self->ra->get_log([$self->{path}], $r0, $r1,
	                   0, 0, 1, sub { $nr++ });
	return 0 if ($nr > 1);
	return 1;
}

sub find_parent_branch {
	my ($self, $paths, $rev) = @_;

	# look for a parent from another branch:
	my $i = $paths->{'/'.$self->rel_path} or return;
	my $branch_from = $i->copyfrom_path or return;
	my $r = $i->copyfrom_rev;
	my $repos_root = $self->ra->{repos_root};
	my $url = $self->ra->{url};
	my $new_url = $repos_root . $branch_from;
	print STDERR "Found possible branch point: ",
	             "$new_url => ", $self->full_url, ", $r\n";
	$branch_from =~ s#^/##;
	my $remotes = read_all_remotes();
	my $gs;
	foreach my $repo_id (keys %$remotes) {
		my $u = $remotes->{$repo_id}->{url} or next;
		next if $url ne $u;
		my $fetch = $remotes->{$repo_id}->{fetch};
		foreach my $f (keys %$fetch) {
			next if $f ne $branch_from;
			$gs = Git::SVN->new($fetch->{$f}, $repo_id, $f);
			last;
		}
		last if $gs;
	}
	unless ($gs) {
		my $ref_id = $branch_from;
		$ref_id .= "\@$r" if find_ref($ref_id);
		# just grow a tail if we're not unique enough :x
		$ref_id .= '-' while find_ref($ref_id);
		$gs = Git::SVN->init($new_url, '', $ref_id, $ref_id);
	}
	my ($r0, $parent) = $gs->find_rev_before($r, 1);
	if ($::_follow_parent && (!defined $r0 || !defined $parent)) {
		foreach (0 .. $r) {
			my $log_entry = eval { $gs->do_fetch(undef, $_) };
			$gs->do_git_commit($log_entry) if $log_entry;
		}
		($r0, $parent) = $gs->last_rev_commit;
	}
	if (defined $r0 && defined $parent && $gs->revisions_eq($r0, $r)) {
		print STDERR "Found branch parent: ($self->{ref_id}) $parent\n";
		command_noisy('read-tree', $parent);
		my $ed;
		if ($self->ra->can_do_switch) {
			# do_switch works with svn/trunk >= r22312, but that
			# is not included with SVN 1.4.2 (the latest version
			# at the moment), so we can't rely on it
			$self->{last_commit} = $parent;
			$ed = SVN::Git::Fetcher->new($self);
|
|
|
|
$gs->ra->gs_do_switch($r0, $rev, $gs->{path}, 1,
|
|
|
|
$self->full_url, $ed)
|
|
|
|
or die "SVN connection failed somewhere...\n";
|
|
|
|
} else {
|
|
|
|
$ed = SVN::Git::Fetcher->new($self);
|
|
|
|
$self->ra->gs_do_update($rev, $rev, $self->{path},
|
|
|
|
1, $ed)
|
|
|
|
or die "SVN connection failed somewhere...\n";
|
|
|
|
}
|
|
|
|
return $self->make_log_entry($rev, [$parent], $ed);
|
|
|
|
}
|
|
|
|
print STDERR "Branch parent not found...\n";
|
|
|
|
return undef;
|
|
|
|
}

sub do_fetch {
	my ($self, $paths, $rev) = @_;
	my $ed;
	my ($last_rev, @parents);
	if ($self->{last_commit}) {
		$ed = SVN::Git::Fetcher->new($self);
		$last_rev = $self->{last_rev};
		$ed->{c} = $self->{last_commit};
		@parents = ($self->{last_commit});
	} else {
		$last_rev = $rev;
		if (my $log_entry = $self->find_parent_branch($paths, $rev)) {
			return $log_entry;
		}
		$ed = SVN::Git::Fetcher->new($self);
	}
	unless ($self->ra->gs_do_update($last_rev, $rev,
	                                $self->{path}, 1, $ed)) {
		die "SVN connection failed somewhere...\n";
	}
	$self->make_log_entry($rev, \@parents, $ed);
}

sub write_untracked {
	my ($self, $rev, $fh, $untracked) = @_;
	my $h;
	print $fh "r$rev\n" or croak $!;
	$h = $untracked->{empty};
	foreach (sort keys %$h) {
		my $act = $h->{$_} ? '+empty_dir' : '-empty_dir';
		print $fh " $act: ", uri_encode($_), "\n" or croak $!;
		warn "W: $act: $_\n";
	}
	foreach my $t (qw/dir_prop file_prop/) {
		$h = $untracked->{$t} or next;
		foreach my $path (sort keys %$h) {
			my $ppath = $path eq '' ? '.' : $path;
			foreach my $prop (sort keys %{$h->{$path}}) {
				next if $SKIP_PROP{$prop};
				my $v = $h->{$path}->{$prop};
				if (defined $v) {
					print $fh " +$t: ",
					          uri_encode($ppath), ' ',
					          uri_encode($prop), ' ',
					          uri_encode($v), "\n"
					          or croak $!;
				} else {
					print $fh " -$t: ",
					          uri_encode($ppath), ' ',
					          uri_encode($prop), "\n"
					          or croak $!;
				}
			}
		}
	}
	foreach my $t (qw/absent_file absent_directory/) {
		$h = $untracked->{$t} or next;
		foreach my $parent (sort keys %$h) {
			foreach my $path (sort @{$h->{$parent}}) {
				print $fh " $t: ",
				          uri_encode("$parent/$path"), "\n"
				          or croak $!;
				warn "W: $t: $parent/$path ",
				     "Insufficient permissions?\n";
			}
		}
	}
}

sub parse_svn_date {
	my $date = shift || return '+0000 1970-01-01 00:00:00';
	my ($Y,$m,$d,$H,$M,$S) = ($date =~ /^(\d{4})\-(\d\d)\-(\d\d)T
	                                     (\d\d)\:(\d\d)\:(\d\d).\d+Z$/x) or
	                          croak "Unable to parse date: $date\n";
	"+0000 $Y-$m-$d $H:$M:$S";
}
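A standalone sketch (not part of the original source; the sample timestamp is made up) of the conversion parse_svn_date performs: SVN revprops carry ISO-8601 UTC dates with fractional seconds, and git wants a "+0000 YYYY-MM-DD HH:MM:SS" string instead.

```perl
use strict;
use warnings;

# Mirrors parse_svn_date above, but uses die instead of croak so the
# snippet has no Carp dependency.
sub svn_date_to_git {
	my $date = shift || return '+0000 1970-01-01 00:00:00';
	my ($Y, $m, $d, $H, $M, $S) =
	    ($date =~ /^(\d{4})-(\d\d)-(\d\d)T(\d\d):(\d\d):(\d\d)\.\d+Z$/)
	    or die "Unable to parse date: $date\n";
	# drop the fractional seconds and prepend the fixed UTC offset:
	"+0000 $Y-$m-$d $H:$M:$S";
}

print svn_date_to_git('2007-01-22T10:20:33.123456Z'), "\n";
```

The fixed "+0000" offset works because the script forces TZ=UTC at startup, so SVN always hands back UTC times.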

sub check_author {
	my ($author) = @_;
	if (!defined $author || length $author == 0) {
		$author = '(no author)';
	}
	if (defined $::_authors && ! defined $::users{$author}) {
		die "Author: $author not defined in $::_authors file\n";
	}
	$author;
}

sub make_log_entry {
	my ($self, $rev, $parents, $untracked) = @_;
	my $rp = $self->ra->rev_proplist($rev);
	my %log_entry = ( parents => $parents || [], revision => $rev,
	                  revprops => $rp, log => '');
	open my $un, '>>', "$self->{dir}/unhandled.log" or croak $!;
	$self->write_untracked($rev, $un, $untracked);
	foreach (sort keys %$rp) {
		my $v = $rp->{$_};
		if (/^svn:(author|date|log)$/) {
			$log_entry{$1} = $v;
		} else {
			print $un " rev_prop: ", uri_encode($_), ' ',
			          uri_encode($v), "\n";
		}
	}
	close $un or croak $!;
	$log_entry{date} = parse_svn_date($log_entry{date});
	$log_entry{author} = check_author($log_entry{author});
	$log_entry{log} .= "\n";
	\%log_entry;
}

sub fetch {
	my ($self, @parents) = @_;
	my ($last_rev, $last_commit) = $self->last_rev_commit;
	my ($base, $head) = $self->parse_revision($last_rev);
	return if ($base > $head);
	if (defined $last_commit) {
		$self->assert_index_clean($last_commit);
	}
	my $inc = 1000;
	my ($min, $max) = ($base, $head < $base + $inc ? $head : $base + $inc);
	my $err_handler = $SVN::Error::handler;
	$SVN::Error::handler = \&skip_unknown_revs;
	while (1) {
		my @revs;
		$self->ra->get_log([''], $min, $max, 0, 1, 1, sub {
			my ($paths, $rev, $author, $date, $log) = @_;
			push @revs, [ $paths, $rev ] });
		foreach (@revs) {
			my $log_entry = $self->do_fetch(@$_);
			$self->do_git_commit($log_entry, @parents);
		}
		last if $max >= $head;
		$min = $max + 1;
		$max += $inc;
		$max = $head if ($max > $head);
	}
	$SVN::Error::handler = $err_handler;
}

sub set_tree_cb {
	my ($self, $log_entry, $tree, $rev, $date, $author) = @_;
	# TODO: enable and test optimized commits:
	if (0 && $rev == ($self->{last_rev} + 1)) {
		$log_entry->{revision} = $rev;
		$log_entry->{author} = $author;
		$self->do_git_commit($log_entry, "$rev=$tree");
	} else {
		$self->fetch("$rev=$tree");
	}
}

sub set_tree {
	my ($self, $tree) = (shift, shift);
	my $log_entry = ::get_commit_entry($tree);
	unless ($self->{last_rev}) {
		fatal("Must have an existing revision to commit\n");
	}
	my $pool = SVN::Pool->new;
	my $ed = SVN::Git::Editor->new({ r => $self->{last_rev},
	                                 ra => $self->ra->dup,
	                                 svn_path => $self->ra->{svn_path}
	                               },
	                               $self->ra->get_commit_editor(
	                                 $log_entry->{log}, sub {
	                                       $self->set_tree_cb($log_entry,
	                                                          $tree, @_);
	                               }),
	                               $pool);
	my $mods = $ed->apply_diff($self->{last_commit}, $tree);
	if (@$mods == 0) {
		print "No changes\nr$self->{last_rev} = $tree\n";
	}
	$pool->clear;
}

sub skip_unknown_revs {
	my ($err) = @_;
	my $errno = $err->apr_err();
	# Maybe the branch we're tracking didn't
	# exist when the repo started, so it's
	# not an error if it doesn't, just continue
	#
	# Wonderfully consistent library, eh?
	# 160013 - svn:// and file://
	# 175002 - http(s)://
	# 175007 - http(s):// (this repo required authorization, too...)
	# More codes may be discovered later...
	if ($errno == 175007 || $errno == 175002 || $errno == 160013) {
		return;
	}
	croak "Error from SVN, ($errno): ", $err->expanded_message, "\n";
}

# rev_db:
# Tie::File seems to be prone to offset errors if revisions get sparse,
# it's not that fast, either.  Tie::File is also not in Perl 5.6.  So
# one of my favorite modules is out :<  Next up would be one of the DBM
# modules, but I'm not sure which is most portable...  So I'll just
# go with something that's plain-text, but still capable of
# being randomly accessed.  So here's my ultra-simple fixed-width
# database.  All records are 40 characters + "\n", so it's easy to seek
# to a revision: (41 * rev) is the byte offset.
# A record of 40 0s denotes an empty revision.
# And yes, it's still pretty fast (faster than Tie::File).

sub rev_db_set {
	my ($self, $rev, $commit) = @_;
	length $commit == 40 or croak "arg3 must be a full SHA1 hexsum\n";
	open my $fh, '+<', $self->{db_path} or croak $!;
	my $offset = $rev * 41;
	# assume that append is the common case:
	seek $fh, 0, 2 or croak $!;
	my $pos = tell $fh;
	if ($pos < $offset) {
		print $fh (('0' x 40),"\n") x (($offset - $pos) / 41)
		  or croak $!;
	}
	seek $fh, $offset, 0 or croak $!;
	print $fh $commit,"\n" or croak $!;
	close $fh or croak $!;
}

sub rev_db_get {
	my ($self, $rev) = @_;
	my $ret;
	my $offset = $rev * 41;
	open my $fh, '<', $self->{db_path} or croak $!;
	if (seek $fh, $offset, 0) {
		$ret = readline $fh;
		if (defined $ret) {
			chomp $ret;
			$ret = undef if ($ret =~ /^0{40}$/);
		}
	}
	close $fh or croak $!;
	$ret;
}
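A self-contained sketch (not part of the original source) of the fixed-width .rev_db format described in the comment block above: each record is 40 hex characters plus a newline, so revision N lives at byte offset 41 * N, and the gap up to a newly written revision is padded with all-zero records. This mirrors rev_db_set/rev_db_get, but runs against a throwaway temp file.

```perl
use strict;
use warnings;
use File::Temp qw/tempfile/;

my ($tmp, $db) = tempfile();
close $tmp;

sub db_set {
	my ($path, $rev, $commit) = @_;
	open my $fh, '+<', $path or die $!;
	seek $fh, 0, 2 or die $!;          # append is the common case
	my $pos = tell $fh;
	my $offset = $rev * 41;
	# pad any skipped revisions with 40 zeros each:
	print $fh (('0' x 40), "\n") x (($offset - $pos) / 41)
		if $pos < $offset;
	seek $fh, $offset, 0 or die $!;
	print $fh $commit, "\n";
	close $fh or die $!;
}

sub db_get {
	my ($path, $rev) = @_;
	open my $fh, '<', $path or die $!;
	seek $fh, $rev * 41, 0 or return undef;
	chomp(my $line = <$fh> // '');
	close $fh;
	# all-zero and missing records both mean "no commit here":
	return $line =~ /^0{40}$/ ? undef : ($line || undef);
}

db_set($db, 3, 'deadbeef' x 5);   # r0..r2 become zero padding
```

Note that seeking past EOF succeeds in Perl, which is why db_get treats a failed readline the same as a zero record rather than relying on seek to fail.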

sub find_rev_before {
	my ($self, $rev, $eq_ok) = @_;
	--$rev unless $eq_ok;
	while ($rev > 0) {
		if (my $c = $self->rev_db_get($rev)) {
			return ($rev, $c);
		}
		--$rev;
	}
	return (undef, undef);
}

sub _new {
	my ($class, $repo_id, $ref_id, $path) = @_;
	unless (defined $repo_id && length $repo_id) {
		$repo_id = $Git::SVN::default_repo_id;
	}
	unless (defined $ref_id && length $ref_id) {
		$_[2] = $ref_id = $repo_id;
	}
	$_[1] = $repo_id = sanitize_remote_name($repo_id);
	my $dir = "$ENV{GIT_DIR}/svn/$ref_id";
	$_[3] = $path = '' unless (defined $path);
	bless { ref_id => $ref_id, dir => $dir, index => "$dir/index",
	        path => $path,
	        db_path => "$dir/.rev_db", repo_id => $repo_id }, $class;
}

sub uri_encode {
	my ($f) = @_;
	$f =~ s#([^a-zA-Z0-9\*!\:_\./\-])#uc sprintf("%%%02x",ord($1))#eg;
	$f
}

package Git::SVN::Prompt;
use strict;
use warnings;
require SVN::Core;
use vars qw/$_no_auth_cache $_username/;

sub simple {
	my ($cred, $realm, $default_username, $may_save, $pool) = @_;
	$may_save = undef if $_no_auth_cache;
	$default_username = $_username if defined $_username;
	if (defined $default_username && length $default_username) {
		if (defined $realm && length $realm) {
			print STDERR "Authentication realm: $realm\n";
			STDERR->flush;
		}
		$cred->username($default_username);
	} else {
		username($cred, $realm, $may_save, $pool);
	}
	$cred->password(_read_password("Password for '" .
	                               $cred->username . "': ", $realm));
	$cred->may_save($may_save);
	$SVN::_Core::SVN_NO_ERROR;
}

sub ssl_server_trust {
	my ($cred, $realm, $failures, $cert_info, $may_save, $pool) = @_;
	$may_save = undef if $_no_auth_cache;
	print STDERR "Error validating server certificate for '$realm':\n";
	if ($failures & $SVN::Auth::SSL::UNKNOWNCA) {
		print STDERR " - The certificate is not issued by a trusted ",
		      "authority. Use the\n",
		      "   fingerprint to validate the certificate manually!\n";
	}
	if ($failures & $SVN::Auth::SSL::CNMISMATCH) {
		print STDERR " - The certificate hostname does not match.\n";
	}
	if ($failures & $SVN::Auth::SSL::NOTYETVALID) {
		print STDERR " - The certificate is not yet valid.\n";
	}
	if ($failures & $SVN::Auth::SSL::EXPIRED) {
		print STDERR " - The certificate has expired.\n";
	}
	if ($failures & $SVN::Auth::SSL::OTHER) {
		print STDERR " - The certificate has an unknown error.\n";
	}
	printf STDERR
	       "Certificate information:\n".
	       " - Hostname: %s\n".
	       " - Valid: from %s until %s\n".
	       " - Issuer: %s\n".
	       " - Fingerprint: %s\n",
	       map $cert_info->$_, qw(hostname valid_from valid_until
	                              issuer_dname fingerprint);
	my $choice;
prompt:
	print STDERR $may_save ?
	      "(R)eject, accept (t)emporarily or accept (p)ermanently? " :
	      "(R)eject or accept (t)emporarily? ";
	STDERR->flush;
	$choice = lc(substr(<STDIN> || 'R', 0, 1));
	if ($choice =~ /^t$/i) {
		$cred->may_save(undef);
	} elsif ($choice =~ /^r$/i) {
		return -1;
	} elsif ($may_save && $choice =~ /^p$/i) {
		$cred->may_save($may_save);
	} else {
		goto prompt;
	}
	$cred->accepted_failures($failures);
	$SVN::_Core::SVN_NO_ERROR;
}

sub ssl_client_cert {
	my ($cred, $realm, $may_save, $pool) = @_;
	$may_save = undef if $_no_auth_cache;
	print STDERR "Client certificate filename: ";
	STDERR->flush;
	chomp(my $filename = <STDIN>);
	$cred->cert_file($filename);
	$cred->may_save($may_save);
	$SVN::_Core::SVN_NO_ERROR;
}

sub ssl_client_cert_pw {
	my ($cred, $realm, $may_save, $pool) = @_;
	$may_save = undef if $_no_auth_cache;
	$cred->password(_read_password("Password: ", $realm));
	$cred->may_save($may_save);
	$SVN::_Core::SVN_NO_ERROR;
}

sub username {
	my ($cred, $realm, $may_save, $pool) = @_;
	$may_save = undef if $_no_auth_cache;
	if (defined $realm && length $realm) {
		print STDERR "Authentication realm: $realm\n";
	}
	my $username;
	if (defined $_username) {
		$username = $_username;
	} else {
		print STDERR "Username: ";
		STDERR->flush;
		chomp($username = <STDIN>);
	}
	$cred->username($username);
	$cred->may_save($may_save);
	$SVN::_Core::SVN_NO_ERROR;
}

sub _read_password {
	my ($prompt, $realm) = @_;
	print STDERR $prompt;
	STDERR->flush;
	require Term::ReadKey;
	Term::ReadKey::ReadMode('noecho');
	my $password = '';
	while (defined(my $key = Term::ReadKey::ReadKey(0))) {
		last if $key =~ /[\012\015]/; # \n\r
		$password .= $key;
	}
	Term::ReadKey::ReadMode('restore');
	print STDERR "\n";
	STDERR->flush;
	$password;
}

package main;

sub uri_encode {
	my ($f) = @_;
	$f =~ s#([^a-zA-Z0-9\*!\:_\./\-])#uc sprintf("%%%02x",ord($1))#eg;
	$f
}

sub uri_decode {
	my ($f) = @_;
	$f =~ tr/+/ /;
	$f =~ s/%([A-F0-9]{2})/chr hex($1)/ge;
	$f
}
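A standalone sketch (not part of the original source) showing how the uri_encode/uri_decode pair above behaves: bytes outside the allowed set become uppercase %XX escapes, and decode reverses them. Note that decode also maps '+' to space (a form-encoding convention), so the pair only round-trips cleanly for strings without a literal '+'.

```perl
use strict;
use warnings;

# Same substitutions as the subs above, renamed to avoid clashing
# with the script's own uri_encode/uri_decode.
sub enc {
	my ($f) = @_;
	$f =~ s#([^a-zA-Z0-9\*!\:_\./\-])#uc sprintf("%%%02x", ord($1))#eg;
	$f;
}

sub dec {
	my ($f) = @_;
	$f =~ tr/+/ /;                       # form-encoding convention
	$f =~ s/%([A-F0-9]{2})/chr hex($1)/ge;
	$f;
}

print enc("a file name\n"), "\n";        # spaces and \n get escaped
```

This is what keeps the unhandled.log entries written by write_untracked one-record-per-line even when SVN paths or property values contain whitespace.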

{
	my $kill_stupid_warnings = $SVN::Node::none.$SVN::Node::file.
	                           $SVN::Node::dir.$SVN::Node::unknown.
	                           $SVN::Node::none.$SVN::Node::file.
	                           $SVN::Node::dir.$SVN::Node::unknown.
	                           $SVN::Auth::SSL::CNMISMATCH.
	                           $SVN::Auth::SSL::NOTYETVALID.
	                           $SVN::Auth::SSL::EXPIRED.
	                           $SVN::Auth::SSL::UNKNOWNCA.
	                           $SVN::Auth::SSL::OTHER;
}

package SVN::Git::Fetcher;
use vars qw/@ISA/;
use strict;
use warnings;
use Carp qw/croak/;
use IO::File qw//;

# file baton members: path, mode_a, mode_b, pool, fh, blob, base
sub new {
	my ($class, $git_svn) = @_;
	my $self = SVN::Delta::Editor->new;
	bless $self, $class;
	$self->{c} = $git_svn->{last_commit} if exists $git_svn->{last_commit};
	if (length $git_svn->{path}) {
		$self->{path_strip} = qr/\Q$git_svn->{path}\E\/?/;
	}
	$self->{empty} = {};
	$self->{dir_prop} = {};
	$self->{file_prop} = {};
	$self->{absent_dir} = {};
	$self->{absent_file} = {};
	($self->{gui}, $self->{ctx}) = $git_svn->tmp_index_do(
	       sub { command_input_pipe(qw/update-index -z --index-info/) } );
	require Digest::MD5;
	$self;
}

sub open_root {
	{ path => '' };
}

sub open_directory {
	my ($self, $path, $pb, $rev) = @_;
	{ path => $path };
}

sub git_path {
	my ($self, $path) = @_;
	$path =~ s!$self->{path_strip}!! if $self->{path_strip};
	$path;
}

sub delete_entry {
	my ($self, $path, $rev, $pb) = @_;
	my $gui = $self->{gui};

	my $gpath = $self->git_path($path);
	# remove entire directories.
	if (command('ls-tree', $self->{c}, '--', $gpath) =~ /^040000 tree/) {
		my ($ls, $ctx) = command_output_pipe(qw/ls-tree
		                                     -r --name-only -z/,
		                                     $self->{c}, '--', $gpath);
		local $/ = "\0";
		while (<$ls>) {
			print $gui '0 ',0 x 40,"\t",$_ or croak $!;
			print "\tD\t$_\n" unless $self->{q};
		}
		print "\tD\t$gpath/\n" unless $self->{q};
		command_close_pipe($ls, $ctx);
		$self->{empty}->{$path} = 0;
	} else {
		print $gui '0 ',0 x 40,"\t",$gpath,"\0" or croak $!;
		print "\tD\t$gpath\n" unless $self->{q};
	}
	undef;
}

sub open_file {
	my ($self, $path, $pb, $rev) = @_;
	my $gpath = $self->git_path($path);
	my ($mode, $blob) = (command('ls-tree', $self->{c}, '--', $gpath)
	                     =~ /^(\d{6}) blob ([a-f\d]{40})\t/);
	unless (defined $mode && defined $blob) {
		die "$path was not found in commit $self->{c} (r$rev)\n";
	}
	{ path => $path, mode_a => $mode, mode_b => $mode, blob => $blob,
	  pool => SVN::Pool->new, action => 'M' };
}

sub add_file {
	my ($self, $path, $pb, $cp_path, $cp_rev) = @_;
	my ($dir, $file) = ($path =~ m#^(.*?)/?([^/]+)$#);
	delete $self->{empty}->{$dir};
	{ path => $path, mode_a => 100644, mode_b => 100644,
	  pool => SVN::Pool->new, action => 'A' };
}

sub add_directory {
	my ($self, $path, $cp_path, $cp_rev) = @_;
	my ($dir, $file) = ($path =~ m#^(.*?)/?([^/]+)$#);
	delete $self->{empty}->{$dir};
	$self->{empty}->{$path} = 1;
	{ path => $path };
}

sub change_dir_prop {
	my ($self, $db, $prop, $value) = @_;
	$self->{dir_prop}->{$db->{path}} ||= {};
	$self->{dir_prop}->{$db->{path}}->{$prop} = $value;
	undef;
}

sub absent_directory {
	my ($self, $path, $pb) = @_;
	$self->{absent_dir}->{$pb->{path}} ||= [];
	push @{$self->{absent_dir}->{$pb->{path}}}, $path;
	undef;
}

sub absent_file {
	my ($self, $path, $pb) = @_;
	$self->{absent_file}->{$pb->{path}} ||= [];
	push @{$self->{absent_file}->{$pb->{path}}}, $path;
	undef;
}

sub change_file_prop {
	my ($self, $fb, $prop, $value) = @_;
	if ($prop eq 'svn:executable') {
		if ($fb->{mode_b} != 120000) {
			$fb->{mode_b} = defined $value ? 100755 : 100644;
		}
	} elsif ($prop eq 'svn:special') {
		$fb->{mode_b} = defined $value ? 120000 : 100644;
	} else {
		$self->{file_prop}->{$fb->{path}} ||= {};
		$self->{file_prop}->{$fb->{path}}->{$prop} = $value;
	}
	undef;
}

sub apply_textdelta {
	my ($self, $fb, $exp) = @_;
	my $fh = IO::File->new_tmpfile;
	$fh->autoflush(1);
	# $fh gets auto-closed() by SVN::TxDelta::apply(),
	# (but $base does not,) so dup() it for reading in close_file
	open my $dup, '<&', $fh or croak $!;
	my $base = IO::File->new_tmpfile;
	$base->autoflush(1);
	if ($fb->{blob}) {
		defined (my $pid = fork) or croak $!;
		if (!$pid) {
			open STDOUT, '>&', $base or croak $!;
			print STDOUT 'link ' if ($fb->{mode_a} == 120000);
			exec qw/git-cat-file blob/, $fb->{blob} or croak $!;
		}
		waitpid $pid, 0;
		croak $? if $?;

		if (defined $exp) {
			seek $base, 0, 0 or croak $!;
			my $md5 = Digest::MD5->new;
			$md5->addfile($base);
			my $got = $md5->hexdigest;
			die "Checksum mismatch: $fb->{path} $fb->{blob}\n",
			    "expected: $exp\n",
			    "     got: $got\n" if ($got ne $exp);
		}
	}
	seek $base, 0, 0 or croak $!;
	$fb->{fh} = $dup;
	$fb->{base} = $base;
	[ SVN::TxDelta::apply($base, $fh, undef, $fb->{path}, $fb->{pool}) ];
}

sub close_file {
	my ($self, $fb, $exp) = @_;
	my $hash;
	my $path = $self->git_path($fb->{path});
	if (my $fh = $fb->{fh}) {
		seek($fh, 0, 0) or croak $!;
		my $md5 = Digest::MD5->new;
		$md5->addfile($fh);
		my $got = $md5->hexdigest;
		die "Checksum mismatch: $path\n",
		    "expected: $exp\n    got: $got\n" if ($got ne $exp);
		seek($fh, 0, 0) or croak $!;
		if ($fb->{mode_b} == 120000) {
			read($fh, my $buf, 5) == 5 or croak $!;
			$buf eq 'link ' or die "$path has mode 120000 ",
			                       "but is not a link\n";
		}
		defined(my $pid = open my $out,'-|') or die "Can't fork: $!\n";
		if (!$pid) {
			open STDIN, '<&', $fh or croak $!;
			exec qw/git-hash-object -w --stdin/ or croak $!;
		}
		chomp($hash = do { local $/; <$out> });
		close $out or croak $!;
		close $fh or croak $!;
		$hash =~ /^[a-f\d]{40}$/ or die "not a sha1: $hash\n";
		close $fb->{base} or croak $!;
	} else {
		$hash = $fb->{blob} or die "no blob information\n";
	}
	$fb->{pool}->clear;
	my $gui = $self->{gui};
	print $gui "$fb->{mode_b} $hash\t$path\0" or croak $!;
	print "\t$fb->{action}\t$path\n" if $fb->{action} && ! $self->{q};
	undef;
}

sub abort_edit {
	my $self = shift;
	eval { command_close_pipe($self->{gui}, $self->{ctx}) };
	$self->SUPER::abort_edit(@_);
}

sub close_edit {
	my $self = shift;
	command_close_pipe($self->{gui}, $self->{ctx});
	$self->{git_commit_ok} = 1;
	$self->SUPER::close_edit(@_);
}

package SVN::Git::Editor;
use vars qw/@ISA/;
use strict;
use warnings;
use Carp qw/croak/;
use IO::File;

sub new {
	my $class = shift;
	my $git_svn = shift;
	my $self = SVN::Delta::Editor->new(@_);
	bless $self, $class;
	foreach (qw/svn_path r ra/) {
|
git-svn: add support for Perl SVN::* libraries
This means we no longer have to deal with having bloated SVN
working copies around and we get a nice performance increase as
well because we don't have to exec the SVN binary and start a
new server connection each time.
Of course we have to manually manage memory with SVN::Pool
whenever we can, and hack around cases where SVN just eats
memory despite pools (I blame Perl, too). I would like to
keep memory usage as stable as possible during long fetch/commit
processes since I still use computers with only 256-512M RAM.
commit should always be faster with the SVN library code. The
SVN::Delta interface is leaky (or I'm not using it with pools
correctly), so I'm forking on every commit, but that doesn't
seem to hurt performance too much (at least on normal Unix/Linux
systems where fork() is pretty cheap).
fetch should be faster in most common cases, but probably not all.
fetches will be faster where client/server delta generation is
the bottleneck and not bandwidth. Of course, full-files are
generated server-side via deltas, too. Full files are always
transferred when they're updated, just like git-svnimport and
unlike command-line svn. I'm also hacking around memory leaks
(see comments) here by using some more forks.
I've tested fetch with http://, https://, file://, and svn://
repositories, so we should be reasonably covered in terms of
error handling for fetching.
Of course, we'll keep plain command-line svn compatibility as a
fallback for people running SVN 1.1 (I'm looking into library
support for 1.1.x SVN, too). If you want to force command-line
SVN usage, set GIT_SVN_NO_LIB=1 in your environment.
We also require two simultaneous connections (just like
git-svnimport), but this shouldn't be a problem for most
servers.
Less important commands:
show-ignore is slower because it requires repository
access, but -r/--revision <num> can be specified.
graft-branches may use more memory, but it's a
short-term process and is funky-filename-safe.
Signed-off-by: Eric Wong <normalperson@yhbt.net>
2006-06-13 02:23:48 +04:00
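The message above mentions forking on every commit to contain SVN::Delta memory leaks. A minimal Python sketch of that containment pattern (illustrative only, not git-svn's code): run the leaky work in a forked child so all of its memory is returned to the OS when the child exits, while the parent merely reaps the exit status.

```python
import os

def run_in_child(work):
    """Run `work` in a forked child so anything it leaks is reclaimed
    by the OS when the child exits; the parent only collects the status."""
    pid = os.fork()
    if pid == 0:              # child: do the leaky work, then exit at once
        work()
        os._exit(0)
    _, status = os.waitpid(pid, 0)   # parent: reap the child
    return os.WEXITSTATUS(status)

print(run_in_child(lambda: None))  # 0
```

The trade-off is one fork() per operation, which the message above judges cheap on normal Unix/Linux systems.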
		die "$_ required!\n" unless (defined $git_svn->{$_});
		$self->{$_} = $git_svn->{$_};
	}
	$self->{pool} = SVN::Pool->new;
	$self->{bat} = { '' => $self->open_root($self->{r}, $self->{pool}) };
	$self->{rm} = { };
	require Digest::MD5;
	return $self;
}

sub split_path {
	return ($_[0] =~ m#^(.*?)/?([^/]+)$#);
}
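For illustration, the split_path pattern can be exercised in Python (a sketch mirroring the Perl one-liner above, not git-svn's code): a lazy prefix, an optional separator, then the final non-slash component.

```python
import re

# Same pattern as the Perl m#^(.*?)/?([^/]+)$#: the lazy (.*?) captures
# the directory part, /? eats the separator, ([^/]+) the last component.
SPLIT = re.compile(r'^(.*?)/?([^/]+)$')

def split_path(path):
    m = SPLIT.match(path)
    return m.groups() if m else None

print(split_path('trunk/src/main.c'))  # ('trunk/src', 'main.c')
print(split_path('README'))            # ('', 'README')
```

A bare filename yields an empty directory part, which is why callers can pass the first element straight to ensure_path.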

sub repo_path {
	(defined $_[1] && length $_[1]) ? $_[1] : ''
}

sub url_path {
	my ($self, $path) = @_;
	$self->{ra}->{url} . '/' . $self->repo_path($path);
}

sub rmdirs {
	my ($self, $tree_b) = @_;
	my $rm = $self->{rm};
	delete $rm->{''}; # we never delete the url we're tracking
	return unless %$rm;

	foreach (keys %$rm) {
		my @d = split m#/#, $_;
		my $c = shift @d;
		$rm->{$c} = 1;
		while (@d) {
			$c .= '/' . shift @d;
			$rm->{$c} = 1;
		}
	}
	delete $rm->{$self->{svn_path}};
	delete $rm->{''}; # we never delete the url we're tracking
	return unless %$rm;

	my ($fh, $ctx) = command_output_pipe(
	                         qw/ls-tree --name-only -r -z/, $tree_b);
	local $/ = "\0";
	while (<$fh>) {
		chomp;
		my @dn = split m#/#, $_;
		while (pop @dn) {
			delete $rm->{join '/', @dn};
		}
		unless (%$rm) {
			close $fh;
			return;
		}
	}
	command_close_pipe($fh, $ctx);

	my ($r, $p, $bat) = ($self->{r}, $self->{pool}, $self->{bat});
	foreach my $d (sort { $b =~ tr#/#/# <=> $a =~ tr#/#/# } keys %$rm) {
		$self->close_directory($bat->{$d}, $p);
		my ($dn) = ($d =~ m#^(.*?)/?(?:[^/]+)$#);
		print "\tD+\t$d/\n" unless $::_q;
		$self->SUPER::delete_entry($d, $r, $bat->{$dn}, $p);
		delete $bat->{$d};
	}
}
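The pruning idea in rmdirs can be sketched in Python (illustrative, not git-svn's code): expand each removal candidate into all of its parent directories, unmark every directory that still contains a surviving file, and delete whatever remains, deepest first.

```python
def prune_dirs(candidates, surviving_files):
    # Expand each candidate into all of its parent directories.
    rm = {}
    for path in candidates:
        parts = path.split('/')
        cur = parts[0]
        rm[cur] = True
        for part in parts[1:]:
            cur += '/' + part
            rm[cur] = True
    rm.pop('', None)  # never delete the tracked root
    # Any directory still holding a surviving file must be kept.
    for f in surviving_files:
        parts = f.split('/')
        while parts:
            parts.pop()
            rm.pop('/'.join(parts), None)
    # Delete deepest directories first, as rmdirs does.
    return sorted(rm, key=lambda d: d.count('/'), reverse=True)

print(prune_dirs(['a/b/c', 'a/d'], ['a/d/keep.txt']))
# ['a/b/c', 'a/b']
```

Sorting by slash count mirrors the `tr#/#/#` comparison above: a child directory must be deleted before its parent.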

sub open_or_add_dir {
	my ($self, $full_path, $baton) = @_;
	my $t = $self->{ra}->check_path($full_path, $self->{r});
	if ($t == $SVN::Node::none) {
		return $self->add_directory($full_path, $baton,
		                            undef, -1, $self->{pool});
	} elsif ($t == $SVN::Node::dir) {
		return $self->open_directory($full_path, $baton,
		                             $self->{r}, $self->{pool});
	}
	print STDERR "$full_path already exists in repository at ",
	             "r$self->{r} and it is not a directory (",
	             ($t == $SVN::Node::file ? 'file' : 'unknown'), "/$t)\n";
	exit 1;
}

sub ensure_path {
	my ($self, $path) = @_;
	my $bat = $self->{bat};
	$path = $self->repo_path($path);
	return $bat->{''} unless (length $path);
	my @p = split m#/+#, $path;
	my $c = shift @p;
	$bat->{$c} ||= $self->open_or_add_dir($c, $bat->{''});
	while (@p) {
		my $c0 = $c;
		$c .= '/' . shift @p;
		$bat->{$c} ||= $self->open_or_add_dir($c, $bat->{$c0});
	}
	return $bat->{$c};
}
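The memoization in ensure_path (the `||=` on `$self->{bat}`) can be sketched in Python (illustrative; `open_or_add` stands in for SVN's open_directory/add_directory): directory batons are cached per path prefix, so each directory is opened at most once per commit.

```python
def make_ensure_path(open_or_add):
    bat = {'': 'root'}  # baton cache, keyed by repository path prefix
    def ensure_path(path):
        if not path:
            return bat['']
        parts = path.split('/')
        cur = parts[0]
        if cur not in bat:
            bat[cur] = open_or_add(cur, bat[''])
        for part in parts[1:]:
            prev = cur
            cur += '/' + part
            if cur not in bat:
                bat[cur] = open_or_add(cur, bat[prev])
        return bat[cur]
    return ensure_path

opened = []
ensure = make_ensure_path(lambda p, parent: opened.append(p) or p)
ensure('a/b/c')
ensure('a/b')   # cache hit: nothing new is opened
print(opened)   # ['a', 'a/b', 'a/b/c']
```

Each directory's baton is created against its parent's baton, which is exactly the parent-child chaining the SVN delta editor requires.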

sub A {
	my ($self, $m) = @_;
	my ($dir, $file) = split_path($m->{file_b});
	my $pbat = $self->ensure_path($dir);
	my $fbat = $self->add_file($self->repo_path($m->{file_b}), $pbat,
	                           undef, -1);
	print "\tA\t$m->{file_b}\n" unless $::_q;
	$self->chg_file($fbat, $m);
	$self->close_file($fbat, undef, $self->{pool});
}

sub C {
	my ($self, $m) = @_;
	my ($dir, $file) = split_path($m->{file_b});
	my $pbat = $self->ensure_path($dir);
	my $fbat = $self->add_file($self->repo_path($m->{file_b}), $pbat,
	                           $self->url_path($m->{file_a}), $self->{r});
	print "\tC\t$m->{file_a} => $m->{file_b}\n" unless $::_q;
	$self->chg_file($fbat, $m);
	$self->close_file($fbat, undef, $self->{pool});
}

sub delete_entry {
	my ($self, $path, $pbat) = @_;
	my $rpath = $self->repo_path($path);
	my ($dir, $file) = split_path($rpath);
	$self->{rm}->{$dir} = 1;
	$self->SUPER::delete_entry($rpath, $self->{r}, $pbat, $self->{pool});
}

sub R {
	my ($self, $m) = @_;
	my ($dir, $file) = split_path($m->{file_b});
	my $pbat = $self->ensure_path($dir);
	my $fbat = $self->add_file($self->repo_path($m->{file_b}), $pbat,
	                           $self->url_path($m->{file_a}), $self->{r});
	print "\tR\t$m->{file_a} => $m->{file_b}\n" unless $::_q;
	$self->chg_file($fbat, $m);
	$self->close_file($fbat, undef, $self->{pool});

	($dir, $file) = split_path($m->{file_a});
	$pbat = $self->ensure_path($dir);
	$self->delete_entry($m->{file_a}, $pbat);
}
|
|
|
|
|
|
|
|
sub M {
	my ($self, $m) = @_;
	my ($dir, $file) = split_path($m->{file_b});
	my $pbat = $self->ensure_path($dir);
	my $fbat = $self->open_file($self->repo_path($m->{file_b}),
				$pbat,$self->{r},$self->{pool});
	print "\t$m->{chg}\t$m->{file_b}\n" unless $::_q;
	$self->chg_file($fbat, $m);
	$self->close_file($fbat,undef,$self->{pool});
}

sub T { shift->M(@_) }

sub change_file_prop {
	my ($self, $fbat, $pname, $pval) = @_;
	$self->SUPER::change_file_prop($fbat, $pname, $pval, $self->{pool});
}

sub chg_file {
	my ($self, $fbat, $m) = @_;
	if ($m->{mode_b} =~ /755$/ && $m->{mode_a} !~ /755$/) {
		$self->change_file_prop($fbat,'svn:executable','*');
	} elsif ($m->{mode_b} !~ /755$/ && $m->{mode_a} =~ /755$/) {
		$self->change_file_prop($fbat,'svn:executable',undef);
	}
	my $fh = IO::File->new_tmpfile or croak $!;
	if ($m->{mode_b} =~ /^120/) {
		print $fh 'link ' or croak $!;
		$self->change_file_prop($fbat,'svn:special','*');
	} elsif ($m->{mode_a} =~ /^120/ && $m->{mode_b} !~ /^120/) {
		$self->change_file_prop($fbat,'svn:special',undef);
	}
	# dump the blob contents into the tmpfile via a child process
	defined(my $pid = fork) or croak $!;
	if (!$pid) {
		open STDOUT, '>&', $fh or croak $!;
		exec qw/git-cat-file blob/, $m->{sha1_b} or croak $!;
	}
	waitpid $pid, 0;
	croak $? if $?;
	$fh->flush == 0 or croak $!;
	seek $fh, 0, 0 or croak $!;

	my $md5 = Digest::MD5->new;
	$md5->addfile($fh) or croak $!;
	seek $fh, 0, 0 or croak $!;

	my $exp = $md5->hexdigest;
	my $pool = SVN::Pool->new;
	my $atd = $self->apply_textdelta($fbat, undef, $pool);
	my $got = SVN::TxDelta::send_stream($fh, @$atd, $pool);
	# verify the checksum the server computed against our own
	die "Checksum mismatch\nexpected: $exp\ngot: $got\n" if ($got ne $exp);
	$pool->clear;

	close $fh or croak $!;
}

sub D {
	my ($self, $m) = @_;
	my ($dir, $file) = split_path($m->{file_b});
	my $pbat = $self->ensure_path($dir);
	print "\tD\t$m->{file_b}\n" unless $::_q;
	$self->delete_entry($m->{file_b}, $pbat);
}

sub close_edit {
	my ($self) = @_;
	my ($p,$bat) = ($self->{pool}, $self->{bat});
	# close deeper directories first (sort by '/' count, descending)
	foreach (sort { $b =~ tr#/#/# <=> $a =~ tr#/#/# } keys %$bat) {
		$self->close_directory($bat->{$_}, $p);
	}
	$self->SUPER::close_edit($p);
	$p->clear;
}

sub abort_edit {
	my ($self) = @_;
	$self->SUPER::abort_edit($self->{pool});
	$self->{pool}->clear;
}

# this drives the editor
sub apply_diff {
	my ($self, $tree_a, $tree_b) = @_;
	my @diff_tree = qw(diff-tree -z -r);
	if ($::_cp_similarity) {
		push @diff_tree, "-C$::_cp_similarity";
	} else {
		push @diff_tree, '-C';
	}
	push @diff_tree, '--find-copies-harder' if $::_find_copies_harder;
	push @diff_tree, "-l$::_l" if defined $::_l;
	push @diff_tree, $tree_a, $tree_b;
	my ($diff_fh, $ctx) = command_output_pipe(@diff_tree);
	my $nl = $/;
	local $/ = "\0";
	my $state = 'meta';
	my @mods;
	while (<$diff_fh>) {
		chomp $_; # this gets rid of the trailing "\0"
		if ($state eq 'meta' && /^:(\d{6})\s(\d{6})\s
					$::sha1\s($::sha1)\s
					([MTCRAD])\d*$/xo) {
			push @mods, { mode_a => $1, mode_b => $2,
				      sha1_b => $3, chg => $4 };
			if ($4 =~ /^(?:C|R)$/) {
				$state = 'file_a';
			} else {
				$state = 'file_b';
			}
		} elsif ($state eq 'file_a') {
			my $x = $mods[$#mods] or croak "Empty array\n";
			if ($x->{chg} !~ /^(?:C|R)$/) {
				croak "Error parsing $_, $x->{chg}\n";
			}
			$x->{file_a} = $_;
			$state = 'file_b';
		} elsif ($state eq 'file_b') {
			my $x = $mods[$#mods] or croak "Empty array\n";
			if (exists $x->{file_a} && $x->{chg} !~ /^(?:C|R)$/) {
				croak "Error parsing $_, $x->{chg}\n";
			}
			if (!exists $x->{file_a} && $x->{chg} =~ /^(?:C|R)$/) {
				croak "Error parsing $_, $x->{chg}\n";
			}
			$x->{file_b} = $_;
			$state = 'meta';
		} else {
			croak "Error parsing $_\n";
		}
	}
	command_close_pipe($diff_fh, $ctx);
	$/ = $nl;

	my %o = ( D => 1, R => 0, C => -1, A => 3, M => 3, T => 3 );
	foreach my $m (sort { $o{$a->{chg}} <=> $o{$b->{chg}} } @mods) {
		my $f = $m->{chg};
		if (defined $o{$f}) {
			$self->$f($m);
		} else {
			fatal("Invalid change type: $f\n");
		}
	}
	$self->rmdirs($tree_b) if $::_rmdir;
	if (@mods == 0) {
		$self->abort_edit;
	} else {
		$self->close_edit;
	}
	\@mods;
}

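# For reference, a raw `git diff-tree -z -r` record (what the 'meta' state
# above matches) looks roughly like this, with paths NUL-terminated:
#
#   :100644 100755 <sha1_a> <sha1_b> M\0path/to/file\0
#
# Copy/rename entries (C/R, with an optional similarity score after the
# status letter) carry two paths: the source (file_a), then the
# destination (file_b); all other change types carry only file_b.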
package Git::SVN::Ra;
use vars qw/@ISA $config_dir/;
use strict;
use warnings;
my ($can_do_switch);
my %RA;

BEGIN {
	# enforce temporary pool usage for some simple functions
	my $e;
	foreach (qw/get_latest_revnum rev_proplist get_file
	            check_path get_dir get_uuid get_repos_root/) {
		$e .= "sub $_ {
			my \$self = shift;
			my \$pool = SVN::Pool->new;
			my \@ret = \$self->SUPER::$_(\@_,\$pool);
			\$pool->clear;
			wantarray ? \@ret : \$ret[0]; }\n";
	}
	eval $e;
}

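# For example, the string eval above generates wrappers equivalent to:
#
#   sub get_uuid {
#           my $self = shift;
#           my $pool = SVN::Pool->new;
#           my @ret = $self->SUPER::get_uuid(@_, $pool);
#           $pool->clear;
#           wantarray ? @ret : $ret[0];
#   }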
sub new {
	my ($class, $url) = @_;
	$url =~ s!/+$!!;
	return $RA{$url} if $RA{$url};

	SVN::_Core::svn_config_ensure($config_dir, undef);
	my ($baton, $callbacks) = SVN::Core::auth_open_helper([
	    SVN::Client::get_simple_provider(),
	    SVN::Client::get_ssl_server_trust_file_provider(),
	    SVN::Client::get_simple_prompt_provider(
	      \&Git::SVN::Prompt::simple, 2),
	    SVN::Client::get_ssl_client_cert_prompt_provider(
	      \&Git::SVN::Prompt::ssl_client_cert, 2),
	    SVN::Client::get_ssl_client_cert_pw_prompt_provider(
	      \&Git::SVN::Prompt::ssl_client_cert_pw, 2),
	    SVN::Client::get_username_provider(),
	    SVN::Client::get_ssl_server_trust_prompt_provider(
	      \&Git::SVN::Prompt::ssl_server_trust),
	    SVN::Client::get_username_prompt_provider(
	      \&Git::SVN::Prompt::username, 2),
	  ]);
	my $config = SVN::Core::config_get_config($config_dir);
	my $self = SVN::Ra->new(url => $url, auth => $baton,
	                      config => $config,
	                      pool => SVN::Pool->new,
	                      auth_provider_callbacks => $callbacks);
	$self->{svn_path} = $url;
	$self->{repos_root} = $self->get_repos_root;
	$self->{svn_path} =~ s#^\Q$self->{repos_root}\E/*##;
	$RA{$url} = bless $self, $class;
}

sub DESTROY {
	# do not call the real DESTROY since we store ourselves in %RA
}

sub dup {
	my ($self) = @_;
	my $dup = SVN::Ra->new(pool => SVN::Pool->new,
				map { $_ => $self->{$_} } qw/config url
	                     auth auth_provider_callbacks repos_root svn_path/);
	bless $dup, ref $self;
}

sub get_log {
	my ($self, @args) = @_;
	my $pool = SVN::Pool->new;
	$args[4]-- if $args[4] && ! $::_follow_parent;
	splice(@args, 3, 1) if ($SVN::Core::VERSION le '1.2.0');
	my $ret = $self->SUPER::get_log(@args, $pool);
	$pool->clear;
	$ret;
}

sub get_commit_editor {
	my ($self, $log, $cb, $pool) = @_;
	my @lock = $SVN::Core::VERSION ge '1.2.0' ? (undef, 0) : ();
	$self->SUPER::get_commit_editor($log, $cb, @lock, $pool);
}

sub uuid {
	my ($self) = @_;
	$self->{uuid} ||= $self->get_uuid;
}

sub gs_do_update {
	my ($self, $rev_a, $rev_b, $path, $recurse, $editor) = @_;
	my $pool = SVN::Pool->new;
	my $reporter = $self->do_update($rev_b, $path, $recurse,
	                                $editor, $pool);
	my @lock = $SVN::Core::VERSION ge '1.2.0' ? (undef) : ();
	my $new = ($rev_a == $rev_b);
	$reporter->set_path('', $rev_a, $new, @lock, $pool);
	$reporter->finish_report($pool);
	$pool->clear;
	$editor->{git_commit_ok};
}

sub gs_do_switch {
	my ($self, $rev_a, $rev_b, $path, $recurse, $url_b, $editor) = @_;
	my $pool = SVN::Pool->new;
	my $reporter = $self->do_switch($rev_b, $path, $recurse,
	                               $url_b, $editor, $pool);
	my @lock = $SVN::Core::VERSION ge '1.2.0' ? (undef) : ();
	$reporter->set_path($path, $rev_a, 0, @lock, $pool);
	$reporter->finish_report($pool);
	$pool->clear;
	$editor->{git_commit_ok};
}

sub can_do_switch {
	my $self = shift;
	unless (defined $can_do_switch) {
		my $pool = SVN::Pool->new;
		my $rep = eval {
			$self->do_switch(1, '', 0, $self->{url},
			                 SVN::Delta::Editor->new, $pool);
		};
		if ($@) {
			$can_do_switch = 0;
		} else {
			$rep->abort_report($pool);
			$can_do_switch = 1;
		}
		$pool->clear;
	}
	$can_do_switch;
}

package Git::SVN::Log;
use strict;
use warnings;
use POSIX qw/strftime/;
use vars qw/$TZ $limit $color $pager $non_recursive $verbose $oneline
            %rusers $show_commit $incremental/;
my $l_fmt;

sub cmt_showable {
	my ($c) = @_;
	return 1 if defined $c->{r};
	if ($c->{l} && $c->{l}->[-1] eq "...\n" &&
				$c->{a_raw} =~ /\@([a-f\d\-]+)>$/) {
		my @log = command(qw/cat-file commit/, $c->{c});
		shift @log while ($log[0] ne "\n");
		shift @log;
		@{$c->{l}} = grep !/^git-svn-id: /, @log;

		(undef, $c->{r}, undef) = ::extract_metadata(
				(grep(/^git-svn-id: /, @log))[-1]);
	}
	return defined $c->{r};
}

sub log_use_color {
	return 1 if $color;
	my ($dc, $dcvar);
	$dcvar = 'color.diff';
	$dc = `git-config --get $dcvar`;
	if ($dc eq '') {
		# nothing at all; fallback to "diff.color"
		$dcvar = 'diff.color';
		$dc = `git-config --get $dcvar`;
	}
	chomp($dc);
	if ($dc eq 'auto') {
		my $pc;
		$pc = `git-config --get color.pager`;
		if ($pc eq '') {
			# does not have it -- fallback to pager.color
			$pc = `git-config --bool --get pager.color`;
		}
		else {
			$pc = `git-config --bool --get color.pager`;
			if ($?) {
				$pc = 'false';
			}
		}
		chomp($pc);
		if (-t *STDOUT || (defined $pager && $pc eq 'true')) {
			return ($ENV{TERM} && $ENV{TERM} ne 'dumb');
		}
		return 0;
	}
	return 0 if $dc eq 'never';
	return 1 if $dc eq 'always';
	chomp($dc = `git-config --bool --get $dcvar`);
	return ($dc eq 'true');
}

sub git_svn_log_cmd {
	my ($r_min, $r_max) = @_;
	my $gs = Git::SVN->_new;
	my @cmd = (qw/log --abbrev-commit --pretty=raw --default/,
	           $gs->refname);
	push @cmd, '-r' unless $non_recursive;
	push @cmd, qw/--raw --name-status/ if $verbose;
	push @cmd, '--color' if log_use_color();
	return @cmd unless defined $r_max;
	if ($r_max == $r_min) {
		push @cmd, '--max-count=1';
		if (my $c = $gs->rev_db_get($r_max)) {
			push @cmd, $c;
		}
	} else {
		my ($c_min, $c_max);
		$c_max = $gs->rev_db_get($r_max);
		$c_min = $gs->rev_db_get($r_min);
		if (defined $c_min && defined $c_max) {
			if ($r_max > $r_min) {
				push @cmd, "$c_min..$c_max";
			} else {
				push @cmd, "$c_max..$c_min";
			}
		} elsif ($r_max > $r_min) {
			push @cmd, $c_max;
		} else {
			push @cmd, $c_min;
		}
	}
	return @cmd;
}

# adapted from pager.c
sub config_pager {
	$pager ||= $ENV{GIT_PAGER} || $ENV{PAGER};
	if (!defined $pager) {
		$pager = 'less';
	} elsif (length $pager == 0 || $pager eq 'cat') {
		$pager = undef;
	}
}

sub run_pager {
	return unless -t *STDOUT;
	pipe my $rfd, my $wfd or return;
	defined(my $pid = fork) or ::fatal "Can't fork: $!\n";
	if (!$pid) {
		open STDOUT, '>&', $wfd or
		             ::fatal "Can't redirect to stdout: $!\n";
		return;
	}
	open STDIN, '<&', $rfd or ::fatal "Can't redirect stdin: $!\n";
	$ENV{LESS} ||= 'FRSX';
	exec $pager or ::fatal "Can't run pager: $! ($pager)\n";
}

sub get_author_info {
	my ($dest, $author, $t, $tz) = @_;
	$author =~ s/(?:^\s*|\s*$)//g;
	$dest->{a_raw} = $author;
	my $au;
	if ($::_authors) {
		$au = $rusers{$author} || undef;
	}
	if (!$au) {
		($au) = ($author =~ /<([^>]+)\@[^>]+>$/);
	}
	$dest->{t} = $t;
	$dest->{tz} = $tz;
	$dest->{a} = $au;
	# Date::Parse isn't in the standard Perl distro :(
	if ($tz =~ s/^\+//) {
		$t += ::tz_to_s_offset($tz);
	} elsif ($tz =~ s/^\-//) {
		$t -= ::tz_to_s_offset($tz);
	}
	$dest->{t_utc} = $t;
}

sub process_commit {
	my ($c, $r_min, $r_max, $defer) = @_;
	if (defined $r_min && defined $r_max) {
		if ($r_min == $c->{r} && $r_min == $r_max) {
			show_commit($c);
			return 0;
		}
		return 1 if $r_min == $r_max;
		if ($r_min < $r_max) {
			# we need to reverse the print order
			return 0 if (defined $limit && --$limit < 0);
			push @$defer, $c;
			return 1;
		}
		if ($r_min != $r_max) {
			return 1 if ($r_min < $c->{r});
			return 1 if ($r_max > $c->{r});
		}
	}
	return 0 if (defined $limit && --$limit < 0);
	show_commit($c);
	return 1;
}

sub show_commit {
	my $c = shift;
	if ($oneline) {
		my $x = "\n";
		if (my $l = $c->{l}) {
			while ($l->[0] =~ /^\s*$/) { shift @$l }
			$x = $l->[0];
		}
		$l_fmt ||= 'A' . length($c->{r});
		print 'r',pack($l_fmt, $c->{r}),' | ';
		print "$c->{c} | " if $show_commit;
		print $x;
	} else {
		show_commit_normal($c);
	}
}

sub show_commit_changed_paths {
	my ($c) = @_;
	return unless $c->{changed};
	print "Changed paths:\n", @{$c->{changed}};
}

sub show_commit_normal {
	my ($c) = @_;
	print '-' x72, "\nr$c->{r} | ";
	print "$c->{c} | " if $show_commit;
	print "$c->{a} | ", strftime("%Y-%m-%d %H:%M:%S %z (%a, %d %b %Y)",
				 localtime($c->{t_utc})), ' | ';
	my $nr_line = 0;

	if (my $l = $c->{l}) {
		while ($l->[$#$l] eq "\n" && $#$l > 0
		       && $l->[($#$l - 1)] eq "\n") {
			pop @$l;
		}
		$nr_line = scalar @$l;
		if (!$nr_line) {
			print "1 line\n\n\n";
		} else {
			if ($nr_line == 1) {
				$nr_line = '1 line';
			} else {
				$nr_line .= ' lines';
			}
			print $nr_line, "\n";
			show_commit_changed_paths($c);
			print "\n";
			print $_ foreach @$l;
		}
	} else {
		print "1 line\n";
		show_commit_changed_paths($c);
		print "\n";
	}
	foreach my $x (qw/raw diff/) {
		if ($c->{$x}) {
			print "\n";
			print $_ foreach @{$c->{$x}}
		}
	}
}

sub cmd_show_log {
|
|
|
|
my (@args) = @_;
|
|
|
|
	my ($r_min, $r_max);
	my $r_last = -1; # prevent dupes
	if (defined $TZ) {
		$ENV{TZ} = $TZ;
	} else {
		delete $ENV{TZ};
	}
	if (defined $::_revision) {
		if ($::_revision =~ /^(\d+):(\d+)$/) {
			($r_min, $r_max) = ($1, $2);
		} elsif ($::_revision =~ /^\d+$/) {
			$r_min = $r_max = $::_revision;
		} else {
			::fatal "-r$::_revision is not supported, use ",
				"standard \'git log\' arguments instead\n";
		}
	}

	config_pager();
	@args = (git_svn_log_cmd($r_min, $r_max), @args);
	my $log = command_output_pipe(@args);
	run_pager();
	my (@k, $c, $d);
	my $esc_color = qr/(?:\033\[(?:(?:\d+;)*\d*)?m)*/;
	while (<$log>) {
		if (/^${esc_color}commit ($::sha1_short)/o) {
			my $cmt = $1;
			if ($c && cmt_showable($c) && $c->{r} != $r_last) {
				$r_last = $c->{r};
				process_commit($c, $r_min, $r_max, \@k) or
								goto out;
			}
			$d = undef;
			$c = { c => $cmt };
		} elsif (/^${esc_color}author (.+) (\d+) ([\-\+]?\d+)$/o) {
			get_author_info($c, $1, $2, $3);
		} elsif (/^${esc_color}(?:tree|parent|committer) /o) {
			# ignore
		} elsif (/^${esc_color}:\d{6} \d{6} $::sha1_short/o) {
			push @{$c->{raw}}, $_;
		} elsif (/^${esc_color}[ACRMDT]\t/) {
			# we could add $SVN->{svn_path} here, but that requires
			# remote access at the moment (repo_path_split)...
			s#^(${esc_color})([ACRMDT])\t#$1   $2 #o;
			push @{$c->{changed}}, $_;
		} elsif (/^${esc_color}diff /o) {
			$d = 1;
			push @{$c->{diff}}, $_;
		} elsif ($d) {
			push @{$c->{diff}}, $_;
		} elsif (/^${esc_color}    (git-svn-id:.+)$/o) {
			($c->{url}, $c->{r}, undef) = ::extract_metadata($1);
		} elsif (s/^${esc_color}    //o) {
			push @{$c->{l}}, $_;
		}
	}
	if ($c && defined $c->{r} && $c->{r} != $r_last) {
		$r_last = $c->{r};
		process_commit($c, $r_min, $r_max, \@k);
	}
	if (@k) {
		my $swap = $r_max;
		$r_max = $r_min;
		$r_min = $swap;
		process_commit($_, $r_min, $r_max) foreach reverse @k;
	}
out:
	close $log;
	print '-' x72,"\n" unless $incremental || $oneline;
}

package Git::SVN::Migration;
# these version numbers do NOT correspond to actual version numbers
# of git nor git-svn.  They are just relative.
#
# v0 layout: .git/$id/info/url, refs/heads/$id-HEAD
#
# v1 layout: .git/$id/info/url, refs/remotes/$id
#
# v2 layout: .git/svn/$id/info/url, refs/remotes/$id
#
# v3 layout: .git/svn/$id, refs/remotes/$id
#            - info/url may remain for backwards compatibility
#            - this is what we migrate up to this layout automatically,
#            - this will be used by git svn init on single branches
#
# v4 layout: .git/svn/$repo_id/$id, refs/remotes/$repo_id/$id
#            - this is only created for newly multi-init-ed
#              repositories.  Similar in spirit to the
#              --use-separate-remotes option in git-clone (now default)
#            - we do not automatically migrate to this (following
#              the example set by core git)
use strict;
use warnings;
use Carp qw/croak/;
use File::Path qw/mkpath/;
use File::Basename qw/dirname basename/;
use vars qw/$_minimize/;
sub migrate_from_v0 {
	my $git_dir = $ENV{GIT_DIR};
	return undef unless -d $git_dir;
	my ($fh, $ctx) = command_output_pipe(qw/rev-parse --symbolic --all/);
	my $migrated = 0;
	while (<$fh>) {
		chomp;
		my ($id, $orig_ref) = ($_, $_);
		next unless $id =~ s#^refs/heads/(.+)-HEAD$#$1#;
		next unless -f "$git_dir/$id/info/url";
		my $new_ref = "refs/remotes/$id";
		if (::verify_ref("$new_ref^0")) {
			print STDERR "W: $orig_ref is probably an old ",
				     "branch used by an ancient version of ",
				     "git-svn.\n",
				     "However, $new_ref also exists.\n",
				     "We will not be able ",
				     "to use this branch until this ",
				     "ambiguity is resolved.\n";
			next;
		}
		print STDERR "Migrating from v0 layout...\n" if !$migrated;
		print STDERR "Renaming ref: $orig_ref => $new_ref\n";
		command_noisy('update-ref', $new_ref, $orig_ref);
		command_noisy('update-ref', '-d', $orig_ref, $orig_ref);
		$migrated++;
	}
	command_close_pipe($fh, $ctx);
	print STDERR "Done migrating from v0 layout...\n" if $migrated;
	$migrated;
}

sub migrate_from_v1 {
	my $git_dir = $ENV{GIT_DIR};
	my $migrated = 0;
	return $migrated unless -d $git_dir;
	my $svn_dir = "$git_dir/svn";

	# just in case somebody used 'svn' as their $id at some point...
	return $migrated if -d $svn_dir && ! -f "$svn_dir/info/url";

	print STDERR "Migrating from a git-svn v1 layout...\n";
	mkpath([$svn_dir]);
	print STDERR "Data from a previous version of git-svn exists, but\n\t",
		     "$svn_dir\n\t(required for this version ",
		     "($::VERSION) of git-svn) does not exist.\n";
	my ($fh, $ctx) = command_output_pipe(qw/rev-parse --symbolic --all/);
	while (<$fh>) {
		my $x = $_;
		next unless $x =~ s#^refs/remotes/##;
		chomp $x;
		next unless -f "$git_dir/$x/info/url";
		my $u = eval { ::file_to_s("$git_dir/$x/info/url") };
		next unless $u;
		my $dn = dirname("$git_dir/svn/$x");
		mkpath([$dn]) unless -d $dn;
		if ($x eq 'svn') { # they used 'svn' as GIT_SVN_ID:
			mkpath(["$git_dir/svn/svn"]);
			print STDERR " - $git_dir/$x/info => ",
				     "$git_dir/svn/$x/info\n";
			rename "$git_dir/$x/info", "$git_dir/svn/$x/info" or
			       croak "$!: $x";
			# don't worry too much about these, they probably
			# don't exist with repos this old (save for index,
			# and we can easily regenerate that)
			foreach my $f (qw/unhandled.log index .rev_db/) {
				rename "$git_dir/$x/$f", "$git_dir/svn/$x/$f";
			}
		} else {
			print STDERR " - $git_dir/$x => $git_dir/svn/$x\n";
			rename "$git_dir/$x", "$git_dir/svn/$x" or
			       croak "$!: $x";
		}
		$migrated++;
	}
	command_close_pipe($fh, $ctx);
	print STDERR "Done migrating from a git-svn v1 layout\n";
	$migrated;
}

sub read_old_urls {
	my ($l_map, $pfx, $path) = @_;
	my @dir;
	foreach (<$path/*>) {
		if (-r "$_/info/url") {
			$pfx .= '/' if $pfx && $pfx !~ m!/$!;
			my $ref_id = $pfx . basename $_;
			my $url = ::file_to_s("$_/info/url");
			$l_map->{$ref_id} = $url;
		} elsif (-d $_) {
			push @dir, $_;
		}
	}
	foreach (@dir) {
		my $x = $_;
		$x =~ s!^\Q$ENV{GIT_DIR}\E/svn/!!o;
		read_old_urls($l_map, $x, $_);
	}
}

sub migrate_from_v2 {
	my @cfg = command(qw/config -l/);
	return if grep /^svn-remote\..+\.url=/, @cfg;
	my %l_map;
	read_old_urls(\%l_map, '', "$ENV{GIT_DIR}/svn");
	my $migrated = 0;

	foreach my $ref_id (sort keys %l_map) {
		Git::SVN->init($l_map{$ref_id}, '', $ref_id, $ref_id);
		$migrated++;
	}
	$migrated;
}

sub minimize_connections {
	my $r = Git::SVN::read_all_remotes();
	my $new_urls = {};
	my $root_repos = {};
	foreach my $repo_id (keys %$r) {
		my $url = $r->{$repo_id}->{url} or next;
		my $fetch = $r->{$repo_id}->{fetch} or next;
		my $ra = Git::SVN::Ra->new($url);

		# skip existing cases where we already connect to the root
		if (($ra->{url} eq $ra->{repos_root}) ||
		    (Git::SVN::sanitize_remote_name($ra->{repos_root}) eq
		     $repo_id)) {
			$root_repos->{$ra->{url}} = $repo_id;
			next;
		}

		my $root_ra = Git::SVN::Ra->new($ra->{repos_root});
		my $root_path = $ra->{url};
		$root_path =~ s#^\Q$ra->{repos_root}\E/*##;
		foreach my $path (keys %$fetch) {
			my $ref_id = $fetch->{$path};
			my $gs = Git::SVN->new($ref_id, $repo_id, $path);

			# make sure we can read when connecting to
			# a higher level of a repository
			my ($last_rev, undef) = $gs->last_rev_commit;
			if (!defined $last_rev) {
				$last_rev = eval {
					$root_ra->get_latest_revnum;
				};
				next if $@;
			}
			my $new = $root_path;
			$new .= length $path ? "/$path" : '';
			eval {
				$root_ra->get_log([$new], $last_rev, $last_rev,
						  0, 0, 1, sub { });
			};
			next if $@;
			$new_urls->{$ra->{repos_root}}->{$new} =
				{ ref_id => $ref_id,
				  old_repo_id => $repo_id,
				  old_path => $path };
		}
	}

	my @emptied;
	foreach my $url (keys %$new_urls) {
		# see if we can re-use an existing [svn-remote "repo_id"]
		# instead of creating a(n ugly) new section:
		my $repo_id = $root_repos->{$url} ||
			      Git::SVN::sanitize_remote_name($url);

		my $fetch = $new_urls->{$url};
		foreach my $path (keys %$fetch) {
			my $x = $fetch->{$path};
			Git::SVN->init($url, $path, $repo_id, $x->{ref_id});
			my $pfx = "svn-remote.$x->{old_repo_id}";

			my $old_fetch = quotemeta("$x->{old_path}:".
			                          "refs/remotes/$x->{ref_id}");
			command_noisy(qw/repo-config --unset/,
				      "$pfx.fetch", '^'. $old_fetch . '$');
			delete $r->{$x->{old_repo_id}}->
				{fetch}->{$x->{old_path}};
			if (!keys %{$r->{$x->{old_repo_id}}->{fetch}}) {
				command_noisy(qw/repo-config --unset/,
					      "$pfx.url");
				push @emptied, $x->{old_repo_id};
			}
		}
	}
	if (@emptied) {
		my $file = $ENV{GIT_CONFIG} || $ENV{GIT_CONFIG_LOCAL} ||
			   "$ENV{GIT_DIR}/config";
		print STDERR <<EOF;
The following [svn-remote] sections in your config file ($file) are empty
and can be safely removed:
EOF
		print STDERR "[svn-remote \"$_\"]\n" foreach @emptied;
	}
}

sub migration_check {
	migrate_from_v0();
	migrate_from_v1();
	migrate_from_v2();
	minimize_connections() if $_minimize;
}

__END__

Data structures:

$log_entry hashref as returned by libsvn_log_entry()
{
	log => 'whitespace-formatted log entry
',						# trailing newline is preserved
	revision => '8',			# integer
	date => '2004-02-24T17:01:44.108345Z',	# commit date
	author => 'committer name'
};

@mods = array of diff-index line hashes, each element represents one line
	of diff-index output

diff-index line ($m hash)
{
	mode_a => first column of diff-index output, no leading ':',
	mode_b => second column of diff-index output,
	sha1_b => sha1sum of the final blob,
	chg    => change type [MCRADT],
	file_a => original file name of a file (iff chg is 'C' or 'R')
	file_b => new/current file name of a file (any chg)
}
;
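As an illustration, one raw "git diff-index -M" line can be split into the
$m hash fields described above.  This is a minimal sketch, not git-svn's own
parser; the helper name parse_diff_index_line is hypothetical, and the sha1
values in the sample line are made up:

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Hypothetical helper: map one raw diff-index line to the $m hash above.
# 'R'/'C' lines carry a similarity score suffix and two filenames;
# all other change types carry a single filename (file_b).
sub parse_diff_index_line {
	my ($line) = @_;
	chomp $line;
	my ($meta, @files) = split /\t/, $line;
	my ($mode_a, $mode_b, $sha1_a, $sha1_b, $chg) =
		($meta =~ /^:(\d{6}) (\d{6}) (\S+) (\S+) ([ACDMRT])\d*$/)
		or return undef;
	my %m = (mode_a => $mode_a, mode_b => $mode_b,
	         sha1_b => $sha1_b, chg => $chg);
	if ($chg eq 'R' || $chg eq 'C') {
		@m{qw/file_a file_b/} = @files;
	} else {
		$m{file_b} = $files[0];
	}
	\%m;
}

my $m = parse_diff_index_line(
	":100644 100755 0123456 89abcde R086\told.pl\tnew.pl");
print "$m->{chg} $m->{file_a} => $m->{file_b}\n"; # R old.pl => new.pl
```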
# retval of read_url_paths{,_all}();
$l_map = {
	# repository root url
	'https://svn.musicpd.org' => {
		# repository path		# GIT_SVN_ID
		'mpd/trunk'		=>	'trunk',
		'mpd/tags/0.11.5'	=>	'tags/0.11.5',
	},
}


Notes:
	I don't trust the each() function unless I created %hash myself
	because the internal iterator may not have started at base.