Saeed Noursalehi 2017-02-02 22:33:24 -08:00
Commit 3c09517e5d
328 changed files with 38997 additions and 0 deletions

.gitattributes vendored Normal file
@@ -0,0 +1,4 @@
###############################################################################
# Do not normalize any line endings.
###############################################################################
* -text

.gitignore vendored Normal file
@@ -0,0 +1,218 @@
## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.
# User-specific files
*.suo
*.user
*.userosscache
*.sln.docstates
# User-specific files (MonoDevelop/Xamarin Studio)
*.userprefs
# Build results
[Dd]ebug/
[Dd]ebugPublic/
[Rr]elease/
[Rr]eleases/
x64/
x86/
build/
bld/
[Bb]in/
[Oo]bj/
# Visual Studio 2015 cache/options directory
.vs/
# MSTest test Results
[Tt]est[Rr]esult*/
[Bb]uild[Ll]og.*
# NUNIT
*.VisualState.xml
TestResult.xml
# Build Results of an ATL Project
[Dd]ebugPS/
[Rr]eleasePS/
dlldata.c
# DNX
project.lock.json
artifacts/
*_i.c
*_p.c
*_i.h
*.ilk
*.meta
*.obj
*.pch
*.pdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.tmp_proj
*.log
*.vspscc
*.vssscc
.builds
*.pidb
*.svclog
*.scc
# Chutzpah Test files
_Chutzpah*
# Visual C++ cache files
ipch/
*.aps
*.ncb
*.opensdf
*.sdf
*.cachefile
*.VC.opendb
*.VC.db
# Visual Studio profiler
*.psess
*.vsp
*.vspx
# TFS 2012 Local Workspace
$tf/
# Guidance Automation Toolkit
*.gpState
# ReSharper is a .NET coding add-in
_ReSharper*/
*.[Rr]e[Ss]harper
*.DotSettings.user
# JustCode is a .NET coding add-in
.JustCode
# TeamCity is a build add-in
_TeamCity*
# DotCover is a Code Coverage Tool
*.dotCover
# NCrunch
_NCrunch_*
.*crunch*.local.xml
# MightyMoose
*.mm.*
AutoTest.Net/
# Web workbench (sass)
.sass-cache/
# Installshield output folder
[Ee]xpress/
# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html
# Click-Once directory
publish/
# Publish Web Output
*.[Pp]ublish.xml
*.azurePubxml
## TODO: Comment the next line if you want to checkin your
## web deploy settings but do note that will include unencrypted
## passwords
#*.pubxml
*.publishproj
# NuGet Packages
*.nupkg
# The packages folder can be ignored because of Package Restore
**/packages/*
# except build/, which is used as an MSBuild target.
!**/packages/build/
# Uncomment if necessary however generally it will be regenerated when needed
#!**/packages/repositories.config
# Windows Azure Build Output
csx/
*.build.csdef
# Windows Store app package directory
AppPackages/
# Visual Studio cache files
# files ending in .cache can be ignored
*.[Cc]ache
# but keep track of directories ending in .cache
!*.[Cc]ache/
# Others
ClientBin/
[Ss]tyle[Cc]op.*
~$*
*~
*.dbmdl
*.dbproj.schemaview
*.pfx
*.publishsettings
node_modules/
orleans.codegen.cs
# RIA/Silverlight projects
Generated_Code/
# Backup & report files from converting an old project file
# to a newer Visual Studio version. Backup files are not needed,
# because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML
UpgradeLog*.htm
# SQL Server files
*.mdf
*.ldf
# Business Intelligence projects
*.rdl.data
*.bim.layout
*.bim_*.settings
# Microsoft Fakes
FakesAssemblies/
# Node.js Tools for Visual Studio
.ntvs_analysis.dat
# Visual Studio 6 build log
*.plg
# Visual Studio 6 workspace options file
*.opt
# LightSwitch generated files
GeneratedArtifacts/
_Pvt_Extensions/
ModelManifest.xml
*.dll
*.cab
*.cer

AuthoringTests.md Normal file
@@ -0,0 +1,59 @@
# Authoring Tests
## Functional Tests
### 1. Running the functional tests
Our functional tests are in the GVFS.FunctionalTests project. They are built on NUnit 3, which is available as a set of NuGet packages.
To run the functional tests:
1. Open GVFS.sln in Visual Studio
2. Build all, which will download the NUnit framework and runner
3. You have three options for how to run the tests, all of which are equivalent.
a. Run the GVFS.FunctionalTests project. Even better, set it as the startup project and hit F5.
b. Use the command line runner. After building, execute ```Scripts\RunFunctionalTests.bat```
c. If you want to use Visual Studio's Test Explorer, you need to install the NUnit 3 Test Adapter in VS | Tools | Extensions and Updates.
Option (a) is probably the most convenient for developers. Option (b) will be used on the build machines.
The functional tests take a set of parameters that indicate what paths and URLs to work with. If you want to customize those settings, they
can be found in GVFS.FunctionalTests\App.config.
### 2. Running Full Suite of Tests vs. Smoke Tests
By default, the GVFS functional tests run a subset of tests as a quick smoke test for developers. To run all tests, pass the `--full-suite` flag.
### 3. Running specific tests
Specific tests can be run by passing `--test=<comma-separated list of tests>` as the command-line arguments to the functional
test project.
### 4. How to write a functional test
Each piece of functionality that we add to GVFS should have corresponding functional tests that clone a repo, mount GVFS, and use existing tools and file system
APIs to interact with the virtual repo.
Since these are functional tests that can potentially modify the state of files on disk, you need to be careful to make sure each test can run in a clean
environment. There are three base classes that you can derive from when writing your tests (a short sketch follows the list below). It's also important to put your new class into the same namespace
as the base class, because NUnit treats namespaces like test suites, and we have logic that keys off of that for deciding when to create enlistments.
1. TestsWithLongRunningEnlistment
Before any test in this namespace is executed, we create a single enlistment and mount GVFS. We then run all tests in this namespace that derive
from this base class. Only put tests in here that are purely readonly and will leave the repo in a good state for future tests.
2. TestsWithEnlistmentPerFixture
For any test fixture (a fixture is the same as a class in NUnit) that derives from this class, we create an enlistment and mount GVFS before running
any of the tests in the fixture, and then we unmount and delete the enlistment after all tests are done (but before any other fixture runs). If you need
to write a sequence of tests that manipulate the same repo, this is the right base class.
3. TestsWithEnlistmentPerTestCase
Derive from this class if you need a brand new enlistment per test case. This is the most reliable, but also most expensive option.
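A minimal sketch of what such a test might look like (the fixture name, namespace, and the `Enlistment.RepoRoot` helper used below are illustrative assumptions; only the `TestsWithEnlistmentPerFixture` base class comes from the list above):

```csharp
using System.IO;
using NUnit.Framework;

// Hypothetical namespace: the fixture must live in the same namespace as its base class,
// because NUnit treats namespaces as test suites (see above).
namespace GVFS.FunctionalTests.Tests.EnlistmentPerFixture
{
    [TestFixture]
    public class ExampleVirtualRepoTests : TestsWithEnlistmentPerFixture
    {
        [TestCase]
        public void CanWriteAndReadBackAFile()
        {
            // "this.Enlistment.RepoRoot" is an assumed helper exposing the mounted repo path.
            string filePath = Path.Combine(this.Enlistment.RepoRoot, "ExampleFolder", "example.txt");
            Directory.CreateDirectory(Path.GetDirectoryName(filePath));
            File.WriteAllText(filePath, "contents");

            Assert.AreEqual("contents", File.ReadAllText(filePath));
        }
    }
}
```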
### 5. Updating the remote test branch
By default, GVFS.FunctionalTests clones master, checks out the branch "FunctionalTests/YYYYMMDD" (where YYYYMMDD is the date the FunctionalTests branch was created),
and then removes all remote tracking information. This is done to guarantee that remote changes to tip cannot break functional tests. If you need to update
the functional tests to use a new FunctionalTests branch, you'll need to create a new "FunctionalTests/YYYYMMDD" branch and update the 'Commitish' setting in App.config
and the project properties (Settings.Designer.cs and Settings.settings) to have this new branch name.
Once you have verified your scenarios locally, you can push the new FunctionalTests branch and then your changes.

GVFS.sln Normal file
@@ -0,0 +1,146 @@

Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio 14
VisualStudioVersion = 14.0.25420.1
MinimumVisualStudioVersion = 10.0.40219.1
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{DCE11095-DA5F-4878-B58D-2702765560F5}"
ProjectSection(SolutionItems) = preProject
.gitattributes = .gitattributes
.gitignore = .gitignore
AuthoringTests.md = AuthoringTests.md
License.md = License.md
nuget.config = nuget.config
Protocol.md = Protocol.md
Readme.md = Readme.md
Settings.StyleCop = Settings.StyleCop
EndProjectSection
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "GVFS", "GVFS", "{2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.GVFlt", "GVFS\GVFS.GVFlt\GVFS.GVFlt.csproj", "{1118B427-7063-422F-83B9-5023C8EC5A7A}"
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "GVFS.GvFltWrapper", "GVFS\GVFS.GvFltWrapper\GVFS.GvFltWrapper.vcxproj", "{FB0831AE-9997-401B-B31F-3A065FDBEB20}"
ProjectSection(ProjectDependencies) = postProject
{5A6656D5-81C7-472C-9DC8-32D071CB2258} = {5A6656D5-81C7-472C-9DC8-32D071CB2258}
{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09} = {374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}
EndProjectSection
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.Common", "GVFS\GVFS.Common\GVFS.Common.csproj", "{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS", "GVFS\GVFS\GVFS.csproj", "{32220664-594C-4425-B9A0-88E0BE2F3D2A}"
ProjectSection(ProjectDependencies) = postProject
{17498502-AEFF-4E70-90CC-1D0B56A8ADF5} = {17498502-AEFF-4E70-90CC-1D0B56A8ADF5}
{5A6656D5-81C7-472C-9DC8-32D071CB2258} = {5A6656D5-81C7-472C-9DC8-32D071CB2258}
{BDA91EE5-C684-4FC5-A90A-B7D677421917} = {BDA91EE5-C684-4FC5-A90A-B7D677421917}
EndProjectSection
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "FastFetch", "GVFS\FastFetch\FastFetch.csproj", "{07F2A520-2AB7-46DD-97C0-75D8E988D55B}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.Tests", "GVFS\GVFS.Tests\GVFS.Tests.csproj", "{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "GVFS Tests", "GVFS Tests", "{C41F10F9-1163-4CFA-A465-EA728F75E9FA}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.UnitTests", "GVFS\GVFS.UnitTests\GVFS.UnitTests.csproj", "{8E0D0989-21F6-4DD8-946C-39F992523CC6}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.FunctionalTests", "GVFS\GVFS.FunctionalTests\GVFS.FunctionalTests.csproj", "{0F0A008E-AB12-40EC-A671-37A541B08C7F}"
ProjectSection(ProjectDependencies) = postProject
{07F2A520-2AB7-46DD-97C0-75D8E988D55B} = {07F2A520-2AB7-46DD-97C0-75D8E988D55B}
{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5} = {3771C555-B5C1-45E2-B8B7-2CEF1619CDC5}
{32220664-594C-4425-B9A0-88E0BE2F3D2A} = {32220664-594C-4425-B9A0-88E0BE2F3D2A}
{BDA91EE5-C684-4FC5-A90A-B7D677421917} = {BDA91EE5-C684-4FC5-A90A-B7D677421917}
EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "GVFS.NativeTests", "GVFS\GVFS.NativeTests\GVFS.NativeTests.vcxproj", "{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.Hooks", "GVFS\GVFS.Hooks\GVFS.Hooks.csproj", "{BDA91EE5-C684-4FC5-A90A-B7D677421917}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "GVFS.Mount", "GVFS\GVFS.Mount\GVFS.Mount.csproj", "{17498502-AEFF-4E70-90CC-1D0B56A8ADF5}"
ProjectSection(ProjectDependencies) = postProject
{5A6656D5-81C7-472C-9DC8-32D071CB2258} = {5A6656D5-81C7-472C-9DC8-32D071CB2258}
EndProjectSection
EndProject
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "GVFS.ReadObjectHook", "GVFS\GVFS.ReadObjectHook\GVFS.ReadObjectHook.vcxproj", "{5A6656D5-81C7-472C-9DC8-32D071CB2258}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Scripts", "Scripts", "{28674A4B-1223-4633-A460-C8CC39B09318}"
ProjectSection(SolutionItems) = preProject
Scripts\CreateCommonAssemblyVersion.bat = Scripts\CreateCommonAssemblyVersion.bat
Scripts\CreateCommonCliAssemblyVersion.bat = Scripts\CreateCommonCliAssemblyVersion.bat
Scripts\CreateCommonVersionHeader.bat = Scripts\CreateCommonVersionHeader.bat
Scripts\RunFunctionalTests.bat = Scripts\RunFunctionalTests.bat
Scripts\RunUnitTests.bat = Scripts\RunUnitTests.bat
EndProjectSection
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|x64 = Debug|x64
Release|x64 = Release|x64
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{1118B427-7063-422F-83B9-5023C8EC5A7A}.Debug|x64.ActiveCfg = Debug|x64
{1118B427-7063-422F-83B9-5023C8EC5A7A}.Debug|x64.Build.0 = Debug|x64
{1118B427-7063-422F-83B9-5023C8EC5A7A}.Release|x64.ActiveCfg = Release|x64
{1118B427-7063-422F-83B9-5023C8EC5A7A}.Release|x64.Build.0 = Release|x64
{FB0831AE-9997-401B-B31F-3A065FDBEB20}.Debug|x64.ActiveCfg = Debug|x64
{FB0831AE-9997-401B-B31F-3A065FDBEB20}.Debug|x64.Build.0 = Debug|x64
{FB0831AE-9997-401B-B31F-3A065FDBEB20}.Release|x64.ActiveCfg = Release|x64
{FB0831AE-9997-401B-B31F-3A065FDBEB20}.Release|x64.Build.0 = Release|x64
{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}.Debug|x64.ActiveCfg = Debug|x64
{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}.Debug|x64.Build.0 = Debug|x64
{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}.Release|x64.ActiveCfg = Release|x64
{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}.Release|x64.Build.0 = Release|x64
{32220664-594C-4425-B9A0-88E0BE2F3D2A}.Debug|x64.ActiveCfg = Debug|x64
{32220664-594C-4425-B9A0-88E0BE2F3D2A}.Debug|x64.Build.0 = Debug|x64
{32220664-594C-4425-B9A0-88E0BE2F3D2A}.Release|x64.ActiveCfg = Release|x64
{32220664-594C-4425-B9A0-88E0BE2F3D2A}.Release|x64.Build.0 = Release|x64
{07F2A520-2AB7-46DD-97C0-75D8E988D55B}.Debug|x64.ActiveCfg = Debug|x64
{07F2A520-2AB7-46DD-97C0-75D8E988D55B}.Debug|x64.Build.0 = Debug|x64
{07F2A520-2AB7-46DD-97C0-75D8E988D55B}.Release|x64.ActiveCfg = Release|x64
{07F2A520-2AB7-46DD-97C0-75D8E988D55B}.Release|x64.Build.0 = Release|x64
{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC}.Debug|x64.ActiveCfg = Debug|x64
{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC}.Debug|x64.Build.0 = Debug|x64
{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC}.Release|x64.ActiveCfg = Release|x64
{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC}.Release|x64.Build.0 = Release|x64
{8E0D0989-21F6-4DD8-946C-39F992523CC6}.Debug|x64.ActiveCfg = Debug|x64
{8E0D0989-21F6-4DD8-946C-39F992523CC6}.Debug|x64.Build.0 = Debug|x64
{8E0D0989-21F6-4DD8-946C-39F992523CC6}.Release|x64.ActiveCfg = Release|x64
{8E0D0989-21F6-4DD8-946C-39F992523CC6}.Release|x64.Build.0 = Release|x64
{0F0A008E-AB12-40EC-A671-37A541B08C7F}.Debug|x64.ActiveCfg = Debug|x64
{0F0A008E-AB12-40EC-A671-37A541B08C7F}.Debug|x64.Build.0 = Debug|x64
{0F0A008E-AB12-40EC-A671-37A541B08C7F}.Release|x64.ActiveCfg = Release|x64
{0F0A008E-AB12-40EC-A671-37A541B08C7F}.Release|x64.Build.0 = Release|x64
{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5}.Debug|x64.ActiveCfg = Debug|x64
{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5}.Debug|x64.Build.0 = Debug|x64
{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5}.Release|x64.ActiveCfg = Release|x64
{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5}.Release|x64.Build.0 = Release|x64
{BDA91EE5-C684-4FC5-A90A-B7D677421917}.Debug|x64.ActiveCfg = Debug|x64
{BDA91EE5-C684-4FC5-A90A-B7D677421917}.Debug|x64.Build.0 = Debug|x64
{BDA91EE5-C684-4FC5-A90A-B7D677421917}.Release|x64.ActiveCfg = Release|x64
{BDA91EE5-C684-4FC5-A90A-B7D677421917}.Release|x64.Build.0 = Release|x64
{17498502-AEFF-4E70-90CC-1D0B56A8ADF5}.Debug|x64.ActiveCfg = Debug|x64
{17498502-AEFF-4E70-90CC-1D0B56A8ADF5}.Debug|x64.Build.0 = Debug|x64
{17498502-AEFF-4E70-90CC-1D0B56A8ADF5}.Release|x64.ActiveCfg = Release|x64
{17498502-AEFF-4E70-90CC-1D0B56A8ADF5}.Release|x64.Build.0 = Release|x64
{5A6656D5-81C7-472C-9DC8-32D071CB2258}.Debug|x64.ActiveCfg = Debug|x64
{5A6656D5-81C7-472C-9DC8-32D071CB2258}.Debug|x64.Build.0 = Debug|x64
{5A6656D5-81C7-472C-9DC8-32D071CB2258}.Release|x64.ActiveCfg = Release|x64
{5A6656D5-81C7-472C-9DC8-32D071CB2258}.Release|x64.Build.0 = Release|x64
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(NestedProjects) = preSolution
{1118B427-7063-422F-83B9-5023C8EC5A7A} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{FB0831AE-9997-401B-B31F-3A065FDBEB20} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{32220664-594C-4425-B9A0-88E0BE2F3D2A} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{07F2A520-2AB7-46DD-97C0-75D8E988D55B} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC} = {C41F10F9-1163-4CFA-A465-EA728F75E9FA}
{8E0D0989-21F6-4DD8-946C-39F992523CC6} = {C41F10F9-1163-4CFA-A465-EA728F75E9FA}
{0F0A008E-AB12-40EC-A671-37A541B08C7F} = {C41F10F9-1163-4CFA-A465-EA728F75E9FA}
{3771C555-B5C1-45E2-B8B7-2CEF1619CDC5} = {C41F10F9-1163-4CFA-A465-EA728F75E9FA}
{BDA91EE5-C684-4FC5-A90A-B7D677421917} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{17498502-AEFF-4E70-90CC-1D0B56A8ADF5} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{5A6656D5-81C7-472C-9DC8-32D071CB2258} = {2EF2EC94-3A68-4ED7-9A58-B7057ADBA01C}
{28674A4B-1223-4633-A460-C8CC39B09318} = {DCE11095-DA5F-4878-B58D-2702765560F5}
EndGlobalSection
EndGlobal

GVFS/FastFetch/App.config Normal file
@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2" />
</startup>
</configuration>

GVFS/FastFetch/FastFetch.csproj Normal file
@@ -0,0 +1,113 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="14.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<Import Project="$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props" Condition="Exists('$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props')" />
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProjectGuid>{07F2A520-2AB7-46DD-97C0-75D8E988D55B}</ProjectGuid>
<OutputType>Exe</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>FastFetch</RootNamespace>
<AssemblyName>FastFetch</AssemblyName>
<TargetFrameworkVersion>v4.5.2</TargetFrameworkVersion>
<FileAlignment>512</FileAlignment>
<AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
<NuGetPackageImportStamp>
</NuGetPackageImportStamp>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|x64'">
<DebugSymbols>true</DebugSymbols>
<OutputPath>..\..\..\BuildOutput\FastFetch\bin\x64\Debug\</OutputPath>
<IntermediateOutputPath>..\..\..\BuildOutput\FastFetch\obj\x64\Debug\</IntermediateOutputPath>
<DefineConstants>DEBUG;TRACE</DefineConstants>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<DebugType>full</DebugType>
<PlatformTarget>x64</PlatformTarget>
<ErrorReport>prompt</ErrorReport>
<CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
<Prefer32Bit>true</Prefer32Bit>
<AllowUnsafeBlocks>true</AllowUnsafeBlocks>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Release|x64'">
<OutputPath>..\..\..\BuildOutput\FastFetch\bin\x64\Release\</OutputPath>
<IntermediateOutputPath>..\..\..\BuildOutput\FastFetch\obj\x64\Release\</IntermediateOutputPath>
<DefineConstants>TRACE</DefineConstants>
<Optimize>true</Optimize>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<DebugType>pdbonly</DebugType>
<PlatformTarget>x64</PlatformTarget>
<ErrorReport>prompt</ErrorReport>
<CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
<Prefer32Bit>true</Prefer32Bit>
<AllowUnsafeBlocks>true</AllowUnsafeBlocks>
</PropertyGroup>
<ItemGroup>
<Reference Include="CommandLine">
<HintPath>..\..\..\packages\CommandLineParser.2.0.275-beta\lib\net45\CommandLine.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Microsoft.Diagnostics.Tracing.EventSource">
<HintPath>..\..\..\packages\Microsoft.Diagnostics.Tracing.EventSource.Redist.1.1.28\lib\net40\Microsoft.Diagnostics.Tracing.EventSource.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="System" />
<Reference Include="System.Core" />
<Reference Include="System.Xml.Linq" />
<Reference Include="System.Data.DataSetExtensions" />
<Reference Include="Microsoft.CSharp" />
<Reference Include="System.Data" />
<Reference Include="System.Net.Http" />
<Reference Include="System.Xml" />
</ItemGroup>
<ItemGroup>
<Compile Include="..\..\..\BuildOutput\CommonAssemblyVersion.cs">
<Link>CommonAssemblyVersion.cs</Link>
</Compile>
<Compile Include="FastFetchVerb.cs" />
<Compile Include="FetchHelper.cs" />
<Compile Include="GitEnlistment.cs" />
<Compile Include="Git\DiffHelper.cs" />
<Compile Include="Git\GitPackIndex.cs" />
<Compile Include="Git\RefSpecHelpers.cs" />
<Compile Include="Jobs\BatchObjectDownloadJob.cs" />
<Compile Include="Jobs\Data\BlobDownloadRequest.cs" />
<Compile Include="Jobs\Data\IndexPackRequest.cs" />
<Compile Include="Jobs\Data\TreeSearchRequest.cs" />
<Compile Include="Jobs\FindMissingBlobsJob.cs" />
<Compile Include="Jobs\IndexPackJob.cs" />
<Compile Include="Jobs\Job.cs" />
<Compile Include="Git\LsTreeHelper.cs" />
<Compile Include="Git\UpdateRefsHelper.cs" />
<Compile Include="Program.cs" />
<Compile Include="Properties\AssemblyInfo.cs" />
</ItemGroup>
<ItemGroup>
<None Include="App.config" />
<None Include="packages.config" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\GVFS.Common\GVFS.Common.csproj">
<Project>{374bf1e5-0b2d-4d4a-bd5e-4212299def09}</Project>
<Name>GVFS.Common</Name>
</ProjectReference>
</ItemGroup>
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
<PropertyGroup>
<ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
</PropertyGroup>
<Error Condition="!Exists('..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets'))" />
<Error Condition="!Exists('..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets'))" />
<Error Condition="!Exists('..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets'))" />
</Target>
<Import Project="..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets" Condition="Exists('..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets')" />
<Import Project="..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets" Condition="Exists('..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets')" />
<Import Project="..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets" Condition="Exists('..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets')" />
<!-- To modify your build process, add your task inside one of the targets below and uncomment it.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
-->
</Project>

GVFS/FastFetch/FastFetchVerb.cs Normal file
@@ -0,0 +1,219 @@
using CommandLine;
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
namespace FastFetch
{
[Verb("fastfetch", HelpText = "Fast-fetch a branch")]
public class FastFetchVerb
{
private const string DefaultBranch = "master";
[Option(
'c',
"commit",
Required = false,
HelpText = "Commit to fetch")]
public string Commit { get; set; }
[Option(
'b',
"branch",
Required = false,
HelpText = "Branch to fetch")]
public string Branch { get; set; }
[Option(
's',
"silent",
Required = false,
Default = false,
HelpText = "Disables console logging")]
public bool Silent { get; set; }
[Option(
"cache-server-url",
Required = false,
Default = "",
HelpText = "Defines the url of the cache server")]
public string CacheServerUrl { get; set; }
[Option(
"chunk-size",
Required = false,
Default = 4000,
HelpText = "Sets the number of objects to be downloaded in a single pack")]
public int ChunkSize { get; set; }
[Option(
"search-thread-count",
Required = false,
Default = 2,
HelpText = "Sets the number of threads to use for finding missing blobs. (0 for number of logical cores)")]
public int SearchThreadCount { get; set; }
[Option(
"download-thread-count",
Required = false,
Default = 0,
HelpText = "Sets the number of threads to use for downloading. (0 for number of logical cores)")]
public int DownloadThreadCount { get; set; }
[Option(
"index-thread-count",
Required = false,
Default = 0,
HelpText = "Sets the number of threads to use for indexing. (0 for number of logical cores)")]
public int IndexThreadCount { get; set; }
[Option(
"checkout-thread-count",
Required = false,
Default = 0,
HelpText = "Sets the number of threads to use for indexing. (0 for number of logical cores)")]
public int CheckoutThreadCount { get; set; }
[Option(
'r',
"max-retries",
Required = false,
Default = 10,
HelpText = "Sets the maximum number of retries for downloading a pack")]
public int MaxRetries { get; set; }
[Option(
"git-path",
Default = "",
Required = false,
HelpText = "Sets the path and filename for git.exe if it isn't expected to be on %PATH%.")]
public string GitBinPath { get; set; }
[Option(
"folders",
Required = false,
Default = "",
HelpText = "A semicolon-delimited list of paths to fetch")]
public string PathWhitelist { get; set; }
[Option(
"folders-list",
Required = false,
Default = "",
HelpText = "A file containing line-delimited list of paths to fetch")]
public string PathWhitelistFile { get; set; }
public void Execute()
{
// CmdParser doesn't strip quotes, and Path.Combine will throw
this.GitBinPath = this.GitBinPath.Replace("\"", string.Empty);
if (!GitProcess.GitExists(this.GitBinPath))
{
Console.WriteLine(
"Could not find git.exe {0}",
!string.IsNullOrWhiteSpace(this.GitBinPath) ? "at " + this.GitBinPath : "on %PATH%");
return;
}
if (this.Commit != null && this.Branch != null)
{
Console.WriteLine("Cannot specify both a commit sha and a branch name to checkout.");
return;
}
this.CacheServerUrl = Enlistment.StripObjectsEndpointSuffix(this.CacheServerUrl);
this.SearchThreadCount = this.SearchThreadCount > 0 ? this.SearchThreadCount : Environment.ProcessorCount;
this.DownloadThreadCount = this.DownloadThreadCount > 0 ? this.DownloadThreadCount : Environment.ProcessorCount;
this.IndexThreadCount = this.IndexThreadCount > 0 ? this.IndexThreadCount : Environment.ProcessorCount;
this.CheckoutThreadCount = this.CheckoutThreadCount > 0 ? this.CheckoutThreadCount : Environment.ProcessorCount;
this.GitBinPath = !string.IsNullOrWhiteSpace(this.GitBinPath) ? this.GitBinPath : GitProcess.GetInstalledGitBinPath();
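// Prefer a GVFS enlistment if the current directory is inside one; otherwise fall back to a plain Git repo enlistment.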
Enlistment enlistment = (Enlistment)GVFSEnlistment.CreateFromCurrentDirectory(this.CacheServerUrl, this.GitBinPath)
?? GitEnlistment.CreateFromCurrentDirectory(this.CacheServerUrl, this.GitBinPath);
if (enlistment == null)
{
Console.WriteLine("Must be run within a .git repo or GVFS enlistment");
return;
}
string commitish = this.Commit ?? this.Branch ?? DefaultBranch;
EventLevel maxVerbosity = this.Silent ? EventLevel.LogAlways : EventLevel.Informational;
using (JsonEtwTracer tracer = new JsonEtwTracer("Microsoft.Git.FastFetch", "FastFetch"))
{
tracer.AddConsoleEventListener(maxVerbosity, Keywords.Any);
tracer.WriteStartEvent(
enlistment.EnlistmentRoot,
enlistment.RepoUrl,
enlistment.CacheServerUrl,
new EventMetadata
{
{ "TargetCommitish", commitish },
});
FetchHelper fetchHelper = this.GetFetchHelper(tracer, enlistment);
fetchHelper.MaxRetries = this.MaxRetries;
if (!FetchHelper.TryLoadPathWhitelist(this.PathWhitelist, this.PathWhitelistFile, tracer, fetchHelper.PathWhitelist))
{
Environment.ExitCode = 1;
return;
}
try
{
bool isBranch = this.Commit == null;
fetchHelper.FastFetch(commitish, isBranch);
if (fetchHelper.HasFailures)
{
Environment.ExitCode = 1;
}
}
catch (AggregateException e)
{
Environment.ExitCode = 1;
foreach (Exception ex in e.Flatten().InnerExceptions)
{
tracer.RelatedError(ex.ToString());
}
}
catch (Exception e)
{
Environment.ExitCode = 1;
tracer.RelatedError(e.ToString());
}
EventMetadata stopMetadata = new EventMetadata();
stopMetadata.Add("Success", Environment.ExitCode == 0);
tracer.Stop(stopMetadata);
}
if (Debugger.IsAttached)
{
Console.ReadKey();
}
}
private FetchHelper GetFetchHelper(ITracer tracer, Enlistment enlistment)
{
return new FetchHelper(
tracer,
enlistment,
this.ChunkSize,
this.SearchThreadCount,
this.DownloadThreadCount,
this.IndexThreadCount);
}
}
}

GVFS/FastFetch/FetchHelper.cs Normal file
@@ -0,0 +1,216 @@
using FastFetch.Git;
using FastFetch.Jobs;
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
namespace FastFetch
{
public class FetchHelper
{
protected readonly Enlistment Enlistment;
protected readonly HttpGitObjects HttpGitObjects;
protected readonly GitObjects GitObjects;
protected readonly ITracer Tracer;
protected readonly int ChunkSize;
protected readonly int SearchThreadCount;
protected readonly int DownloadThreadCount;
protected readonly int IndexThreadCount;
protected readonly bool SkipConfigUpdate;
private const string AreaPath = nameof(FetchHelper);
// Shallow clones don't require their parent commits
private const int CommitDepth = 1;
public FetchHelper(
ITracer tracer,
Enlistment enlistment,
int chunkSize,
int searchThreadCount,
int downloadThreadCount,
int indexThreadCount)
{
this.SearchThreadCount = searchThreadCount;
this.DownloadThreadCount = downloadThreadCount;
this.IndexThreadCount = indexThreadCount;
this.ChunkSize = chunkSize;
this.Tracer = tracer;
this.Enlistment = enlistment;
this.HttpGitObjects = new HttpGitObjects(tracer, enlistment, downloadThreadCount);
this.GitObjects = new GitObjects(tracer, enlistment, this.HttpGitObjects);
this.PathWhitelist = new List<string>();
// We never want to update config settings for a GVFSEnlistment
this.SkipConfigUpdate = enlistment is GVFSEnlistment;
}
public int MaxRetries
{
get { return this.HttpGitObjects.MaxRetries; }
set { this.HttpGitObjects.MaxRetries = value; }
}
public bool HasFailures { get; protected set; }
public List<string> PathWhitelist { get; private set; }
public static bool TryLoadPathWhitelist(string pathWhitelistInput, string pathWhitelistFile, ITracer tracer, List<string> pathWhitelistOutput)
{
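// Normalize whitelist entries: trim whitespace and quotes, convert to forward slashes, and drop any leading slash so path comparisons are consistent.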
Func<string, string> cleanPath = path => path.Trim(' ', '\r', '\n', '"').Replace('\\', '/').TrimStart('/');
pathWhitelistOutput.AddRange(pathWhitelistInput.Split(';').Select(cleanPath));
if (!string.IsNullOrWhiteSpace(pathWhitelistFile))
{
if (File.Exists(pathWhitelistFile))
{
pathWhitelistOutput.AddRange(File.ReadAllLines(pathWhitelistFile).Select(cleanPath));
}
else
{
tracer.RelatedError("Could not find '{0}' for folder filtering.", pathWhitelistFile);
Console.WriteLine("Could not find '{0}' for folder filtering.", pathWhitelistFile);
return false;
}
}
pathWhitelistOutput.RemoveAll(string.IsNullOrWhiteSpace);
return true;
}
/// <param name="branchOrCommit">A specific branch to filter for, or null for all branches returned from info/refs</param>
public virtual void FastFetch(string branchOrCommit, bool isBranch)
{
if (string.IsNullOrWhiteSpace(branchOrCommit))
{
throw new FetchException("Must specify branch or commit to fetch");
}
GitRefs refs = null;
string commitToFetch;
if (isBranch)
{
refs = this.HttpGitObjects.QueryInfoRefs(branchOrCommit);
if (refs == null)
{
throw new FetchException("Could not query info/refs from: {0}", this.Enlistment.RepoUrl);
}
else if (refs.Count == 0)
{
throw new FetchException("Could not find branch {0} in info/refs from: {1}", branchOrCommit, this.Enlistment.RepoUrl);
}
commitToFetch = refs.GetTipCommitIds().Single();
}
else
{
commitToFetch = branchOrCommit;
}
this.DownloadMissingCommit(commitToFetch, this.GitObjects);
// Dummy output queue since we don't need to checkout available blobs
BlockingCollection<string> availableBlobs = new BlockingCollection<string>();
// Configure pipeline
// LsTreeHelper output => FindMissingBlobs => BatchDownload => IndexPack
LsTreeHelper blobEnumerator = new LsTreeHelper(this.PathWhitelist, this.Tracer, this.Enlistment);
FindMissingBlobsJob blobFinder = new FindMissingBlobsJob(this.SearchThreadCount, blobEnumerator.BlobIdOutput, availableBlobs, this.Tracer, this.Enlistment);
BatchObjectDownloadJob downloader = new BatchObjectDownloadJob(this.DownloadThreadCount, this.ChunkSize, blobFinder.DownloadQueue, availableBlobs, this.Tracer, this.Enlistment, this.HttpGitObjects, this.GitObjects);
IndexPackJob packIndexer = new IndexPackJob(this.IndexThreadCount, downloader.AvailablePacks, availableBlobs, this.Tracer, this.GitObjects);
blobFinder.Start();
downloader.Start();
this.HasFailures |= !blobEnumerator.EnqueueAllBlobs(commitToFetch);
// If indexing happens during searching, searching progressively gets slower, so wait on searching before indexing.
blobFinder.WaitForCompletion();
this.HasFailures |= blobFinder.HasFailures;
// Index regardless of failures, it'll shorten the next fetch.
packIndexer.Start();
downloader.WaitForCompletion();
this.HasFailures |= downloader.HasFailures;
packIndexer.WaitForCompletion();
this.HasFailures |= packIndexer.HasFailures;
if (!this.SkipConfigUpdate)
{
this.UpdateRefs(branchOrCommit, isBranch, refs);
if (isBranch)
{
this.HasFailures |= !RefSpecHelpers.UpdateRefSpec(this.Tracer, this.Enlistment, branchOrCommit, refs);
}
}
}
/// <summary>
/// * Updates any remote branch (N/A for fetch of detached commit)
/// * Updates shallow file
/// </summary>
protected virtual void UpdateRefs(string branchOrCommit, bool isBranch, GitRefs refs)
{
UpdateRefsHelper refHelper = new UpdateRefsHelper(this.Enlistment);
string commitSha = null;
if (isBranch)
{
KeyValuePair<string, string> remoteRef = refs.GetBranchRefPairs().Single();
string remoteBranch = remoteRef.Key;
commitSha = remoteRef.Value;
this.HasFailures |= !refHelper.UpdateRef(this.Tracer, remoteBranch, commitSha);
}
else
{
commitSha = branchOrCommit;
}
// Update shallow file to ensure this is a valid shallow repo
File.AppendAllText(Path.Combine(this.Enlistment.WorkingDirectoryRoot, GVFSConstants.DotGit.Shallow), commitSha + "\n");
}
protected void DownloadMissingCommit(string commitSha, GitObjects gitObjects)
{
EventMetadata startMetadata = new EventMetadata();
startMetadata.Add("CommitSha", commitSha);
startMetadata.Add("CommitDepth", CommitDepth);
using (ITracer activity = this.Tracer.StartActivity("DownloadTrees", EventLevel.Informational, startMetadata))
{
using (GitCatFileBatchCheckProcess catFileProcess = new GitCatFileBatchCheckProcess(this.Enlistment))
{
if (!catFileProcess.ObjectExists(commitSha))
{
if (!gitObjects.TryDownloadAndSaveCommits(new[] { commitSha }, commitDepth: CommitDepth))
{
EventMetadata metadata = new EventMetadata();
metadata.Add("ObjectsEndpointUrl", this.Enlistment.ObjectsEndpointUrl);
activity.RelatedError(metadata);
throw new FetchException("Could not download commits from {0}", this.Enlistment.ObjectsEndpointUrl);
}
}
}
}
}
public class FetchException : Exception
{
public FetchException(string format, params object[] args)
: base(string.Format(format, args))
{
}
}
}
}

GVFS/FastFetch/Git/DiffHelper.cs Normal file
@@ -0,0 +1,265 @@
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
namespace GVFS.Common.Git
{
public class DiffHelper
{
private const string AreaPath = nameof(DiffHelper);
private ITracer tracer;
private List<string> pathWhitelist;
private List<string> deletedPaths = new List<string>();
private HashSet<string> filesAdded = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private Enlistment enlistment;
private string targetCommitSha;
private int additionalDirDeletes = 0;
private int additionalFileDeletes = 0;
public DiffHelper(ITracer tracer, Enlistment enlistment, string targetCommitSha, IEnumerable<string> pathWhitelist)
{
this.tracer = tracer;
this.pathWhitelist = new List<string>(pathWhitelist);
this.enlistment = enlistment;
this.targetCommitSha = targetCommitSha;
this.DirectoryOperations = new ConcurrentQueue<Git.DiffTreeResult>();
this.FileDeleteOperations = new ConcurrentQueue<string>();
this.FileAddOperations = new ConcurrentDictionary<string, HashSet<string>>(StringComparer.OrdinalIgnoreCase);
this.RequiredBlobs = new BlockingCollection<string>();
}
public bool HasFailures { get; private set; }
public ConcurrentQueue<DiffTreeResult> DirectoryOperations { get; }
public ConcurrentQueue<string> FileDeleteOperations { get; }
/// <summary>
/// Mapping from available sha to filenames where blob should be written
/// </summary>
public ConcurrentDictionary<string, HashSet<string>> FileAddOperations { get; }
/// <summary>
/// Blobs required to perform a checkout of the destination
/// </summary>
public BlockingCollection<string> RequiredBlobs { get; }
public int TotalDirectoryOperations
{
get { return this.DirectoryOperations.Count + this.additionalDirDeletes; }
}
public int TotalFileDeletes
{
get { return this.FileDeleteOperations.Count + this.additionalFileDeletes; }
}
public void PerformDiff()
{
using (GitCatFileBatchProcess catFile = new GitCatFileBatchProcess(this.enlistment))
{
GitProcess git = new GitProcess(this.enlistment);
string repoRoot = git.GetRepoRoot();
string targetTreeSha = catFile.GetTreeSha(this.targetCommitSha);
string headTreeSha = catFile.GetTreeSha("HEAD");
EventMetadata metadata = new EventMetadata();
metadata.Add("TargetTreeSha", targetTreeSha);
metadata.Add("HeadTreeSha", headTreeSha);
using (ITracer activity = this.tracer.StartActivity("PerformDiff", EventLevel.Informational, metadata))
{
metadata = new EventMetadata();
if (headTreeSha == null)
{
// Nothing is checked out (fresh git init), so we must search the entire tree.
git.LsTree(targetTreeSha, this.EnqueueOperationsFromLsTreeLine, recursive: true, showAllTrees: true);
metadata.Add("Operation", "LsTree");
}
else
{
// Diff head and target, determine what needs to be done.
git.DiffTree(headTreeSha, targetTreeSha, line => this.EnqueueOperationsFromDiffTreeLine(this.tracer, repoRoot, line));
metadata.Add("Operation", "DiffTree");
}
this.RequiredBlobs.CompleteAdding();
metadata.Add("Success", !this.HasFailures);
metadata.Add("DirectoryOperationsCount", this.TotalDirectoryOperations);
metadata.Add("FileDeleteOperationsCount", this.TotalFileDeletes);
metadata.Add("RequiredBlobsCount", this.RequiredBlobs.Count);
activity.Stop(metadata);
}
}
}
public void ParseDiffFile(string filename, string repoRoot)
{
using (ITracer activity = this.tracer.StartActivity("PerformDiff", EventLevel.Informational))
{
using (StreamReader file = new StreamReader(File.OpenRead(filename)))
{
while (!file.EndOfStream)
{
this.EnqueueOperationsFromDiffTreeLine(activity, repoRoot, file.ReadLine());
}
}
}
}
private void EnqueueOperationsFromLsTreeLine(string line)
{
DiffTreeResult result = DiffTreeResult.ParseFromLsTreeLine(line, this.enlistment.EnlistmentRoot);
if (result == null)
{
this.tracer.RelatedError("Unrecognized ls-tree line: {0}", line);
}
if (!this.ResultIsInWhitelist(result))
{
return;
}
if (result.TargetIsDirectory)
{
this.DirectoryOperations.Enqueue(result);
}
else
{
this.EnqueueFileAddOperation(result);
}
}
private void EnqueueOperationsFromDiffTreeLine(ITracer activity, string repoRoot, string line)
{
if (!line.StartsWith(":"))
{
// Diff-tree starts with metadata we can ignore.
// Real diff lines always start with a colon
return;
}
DiffTreeResult result = DiffTreeResult.ParseFromDiffTreeLine(line, repoRoot);
if (!this.ResultIsInWhitelist(result))
{
return;
}
if (result.Operation == DiffTreeResult.Operations.Unknown ||
result.Operation == DiffTreeResult.Operations.Unmerged)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Path", result.TargetFilename);
metadata.Add("ErrorMessage", "Unexpected diff operation: " + result.Operation);
activity.RelatedError(metadata);
this.HasFailures = true;
return;
}
if (result.Operation == DiffTreeResult.Operations.Delete)
{
// Don't enqueue deletes that will be handled by recursively deleting their parent.
// Git traverses diffs in pre-order, so we are guaranteed to ignore child deletes here.
// Append trailing slash terminator to avoid matches with directory prefixes (Eg. \GVFS and \GVFS.Common)
string pathWithSlash = result.TargetFilename + "\\";
if (this.deletedPaths.Any(path => pathWithSlash.StartsWith(path, StringComparison.OrdinalIgnoreCase)))
{
if (result.SourceIsDirectory || result.TargetIsDirectory)
{
Interlocked.Increment(ref this.additionalDirDeletes);
}
else
{
Interlocked.Increment(ref this.additionalFileDeletes);
}
return;
}
this.deletedPaths.Add(pathWithSlash);
}
// Separate and enqueue all directory operations first.
if (result.SourceIsDirectory || result.TargetIsDirectory)
{
// Handle when a directory becomes a file.
// Files becoming directories is handled by HandleAllDirectoryOperations
if (result.Operation == DiffTreeResult.Operations.RenameEdit &&
!result.TargetIsDirectory)
{
this.EnqueueFileAddOperation(result);
}
this.DirectoryOperations.Enqueue(result);
}
else
{
switch (result.Operation)
{
case DiffTreeResult.Operations.Delete:
this.FileDeleteOperations.Enqueue(result.TargetFilename);
break;
case DiffTreeResult.Operations.RenameEdit:
this.FileDeleteOperations.Enqueue(result.SourceFilename);
this.EnqueueFileAddOperation(result);
break;
case DiffTreeResult.Operations.Modify:
case DiffTreeResult.Operations.CopyEdit:
case DiffTreeResult.Operations.Add:
this.EnqueueFileAddOperation(result);
break;
default:
activity.RelatedError("Unexpected diff operation from line: {0}", line);
break;
}
}
}
private bool ResultIsInWhitelist(DiffTreeResult blobAdd)
{
return blobAdd.TargetFilename == null ||
!this.pathWhitelist.Any() ||
this.pathWhitelist.Any(path => blobAdd.TargetFilename.StartsWith(path, StringComparison.OrdinalIgnoreCase));
}
/// <remarks>
/// This is not used in a multithreaded method, so it does not need to be thread-safe.
/// </remarks>
private void EnqueueFileAddOperation(DiffTreeResult operation)
{
// Each filepath should be case-insensitive unique. If there are duplicates, only the last parsed one should remain.
if (!this.filesAdded.Add(operation.TargetFilename))
{
foreach (KeyValuePair<string, HashSet<string>> kvp in this.FileAddOperations)
{
if (kvp.Value.Remove(operation.TargetFilename))
{
break;
}
}
}
HashSet<string> operations = new HashSet<string>(StringComparer.OrdinalIgnoreCase) { operation.TargetFilename };
this.FileAddOperations.AddOrUpdate(
operation.TargetSha,
operations,
(key, oldValue) =>
{
oldValue.Add(operation.TargetFilename);
return oldValue;
});
this.RequiredBlobs.Add(operation.TargetSha);
}
}
}

GVFS/FastFetch/Git/GitPackIndex.cs Normal file
@@ -0,0 +1,47 @@
using GVFS.Common.Physical.Git;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
namespace FastFetch.Git
{
public class GitPackIndex
{
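// Magic bytes "\377tOc" (0xff744f63) that begin a version-2 pack index (.idx) file.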
private const uint PackIndexSignature = 0xff744f63;
private const int Sha1ByteLength = 20;
public static IEnumerable<string> GetShas(string filePath)
{
using (FileStream stream = File.OpenRead(filePath))
using (BigEndianReader binReader = new BigEndianReader(stream))
{
VerifyHeader(binReader);
// The fanout table has 256 4-byte buckets; bucket N holds the cumulative count of objects whose first SHA-1 byte is <= N,
// so the last bucket is always the total object count.
stream.Position += 255 * sizeof(uint);
uint totalObjects = binReader.ReadUInt32();
for (int i = 0; i < totalObjects; ++i)
{
yield return BitConverter.ToString(binReader.ReadBytes(Sha1ByteLength)).Replace("-", string.Empty);
}
}
}
private static void VerifyHeader(BinaryReader binReader)
{
uint signature = binReader.ReadUInt32();
if (signature != PackIndexSignature)
{
throw new InvalidDataException("Bad pack header");
}
uint version = binReader.ReadUInt32();
if (version != 2)
{
throw new InvalidDataException("Unsupported pack index version");
}
}
}
}

GVFS/FastFetch/Git/LsTreeHelper.cs Normal file
@@ -0,0 +1,71 @@
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
namespace FastFetch.Jobs
{
public class LsTreeHelper
{
private const string AreaPath = nameof(LsTreeHelper);
private List<string> pathWhitelist;
private ITracer tracer;
private Enlistment enlistment;
public LsTreeHelper(
IEnumerable<string> pathWhitelist,
ITracer tracer,
Enlistment enlistment)
{
this.pathWhitelist = new List<string>(pathWhitelist);
this.tracer = tracer;
this.enlistment = enlistment;
this.BlobIdOutput = new BlockingCollection<string>();
}
public BlockingCollection<string> BlobIdOutput { get; set; }
public bool EnqueueAllBlobs(string rootTreeSha)
{
GitProcess git = new GitProcess(this.enlistment);
EventMetadata metadata = new EventMetadata();
metadata.Add("TreeSha", rootTreeSha);
using (ITracer activity = this.tracer.StartActivity(AreaPath, EventLevel.Informational, metadata))
{
GitProcess.Result result = git.LsTree(rootTreeSha, this.AddIfLineIsBlob, recursive: true);
if (result.HasErrors)
{
metadata.Add("ErrorMessage", result.Errors);
activity.RelatedError(metadata);
return false;
}
}
this.BlobIdOutput.CompleteAdding();
return true;
}
private void AddIfLineIsBlob(string blobLine)
{
int blobIdIndex = blobLine.IndexOf(GitCatFileProcess.BlobMarker);
if (blobIdIndex > -1)
{
string blobSha = blobLine.Substring(blobIdIndex + GitCatFileProcess.BlobMarker.Length, GVFSConstants.ShaStringLength);
string blobName = blobLine.Substring(blobLine.LastIndexOf('\t')).Trim();
if (!this.pathWhitelist.Any() ||
this.pathWhitelist.Any(whitePath => blobName.StartsWith(whitePath, StringComparison.OrdinalIgnoreCase)))
{
this.BlobIdOutput.Add(blobSha);
}
}
}
}
}

GVFS/FastFetch/Git/RefSpecHelpers.cs Normal file
@@ -0,0 +1,41 @@
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Linq;
namespace FastFetch.Git
{
public static class RefSpecHelpers
{
public const string RefsHeadsGitPath = "refs/heads/";
public static bool UpdateRefSpec(ITracer tracer, Enlistment enlistment, string branchOrCommit, GitRefs refs)
{
using (ITracer activity = tracer.StartActivity("UpdateRefSpec", EventLevel.Informational))
{
const string OriginRefMapSettingName = "remote.origin.fetch";
// We must update the refspec to get proper "git pull" functionality.
string localBranch = branchOrCommit.StartsWith(RefsHeadsGitPath) ? branchOrCommit : (RefsHeadsGitPath + branchOrCommit);
string remoteBranch = refs.GetBranchRefPairs().Single().Key;
string refSpec = "+" + localBranch + ":" + remoteBranch;
GitProcess git = new GitProcess(enlistment);
// Replacing all ref-specs:
// * ensures the default refspec (remote.origin.fetch=+refs/heads/*:refs/remotes/origin/*) is removed which avoids some "git fetch/pull" failures
// * gives added "git fetch" performance since git will only fetch the branch provided in the refspec.
GitProcess.Result setResult = git.SetInLocalConfig(OriginRefMapSettingName, refSpec, replaceAll: true);
if (setResult.HasErrors)
{
activity.RelatedError("Could not update ref spec to {0}: {1}", refSpec, setResult.Errors);
return false;
}
}
return true;
}
}
}

GVFS/FastFetch/Git/UpdateRefsHelper.cs Normal file
@@ -0,0 +1,55 @@
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
namespace FastFetch.Jobs
{
public class UpdateRefsHelper
{
private const string AreaPath = nameof(UpdateRefsHelper);
private Enlistment enlistment;
public UpdateRefsHelper(Enlistment enlistment)
{
this.enlistment = enlistment;
}
/// <returns>True on success, false otherwise</returns>
public bool UpdateRef(ITracer tracer, string refName, string targetCommitish)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("RefName", refName);
metadata.Add("TargetCommitish", targetCommitish);
using (ITracer activity = tracer.StartActivity(AreaPath, EventLevel.Informational, metadata))
{
GitProcess gitProcess = new GitProcess(this.enlistment);
GitProcess.Result result = null;
if (this.IsSymbolicRef(targetCommitish))
{
// Using update-ref with a branch name will leave a SHA in the ref file which detaches HEAD, so use symbolic-ref instead.
result = gitProcess.UpdateBranchSymbolicRef(refName, targetCommitish);
}
else
{
result = gitProcess.UpdateBranchSha(refName, targetCommitish);
}
if (result.HasErrors)
{
activity.RelatedError(result.Errors);
return false;
}
return true;
}
}
private bool IsSymbolicRef(string targetCommitish)
{
return targetCommitish.StartsWith("refs/", StringComparison.OrdinalIgnoreCase);
}
}
}

GVFS/FastFetch/GitEnlistment.cs Normal file
@@ -0,0 +1,33 @@
using GVFS.Common;
using System;
using System.IO;
using System.Linq;
namespace FastFetch
{
public class GitEnlistment : Enlistment
{
private GitEnlistment(string repoRoot, string cacheBaseUrl, string gitBinPath)
: base(repoRoot, repoRoot, cacheBaseUrl, gitBinPath, gvfsHooksRoot: null)
{
}
public static GitEnlistment CreateFromCurrentDirectory(string objectsEndpoint, string gitBinPath)
{
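// Walk up from the current directory until a directory containing .git is found; return null if there is none.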
DirectoryInfo dirInfo = new DirectoryInfo(Environment.CurrentDirectory);
while (dirInfo != null && dirInfo.Exists)
{
DirectoryInfo[] dotGitDirs = dirInfo.GetDirectories(GVFSConstants.DotGit.Root);
if (dotGitDirs.Count() == 1)
{
return new GitEnlistment(dirInfo.FullName, objectsEndpoint, gitBinPath);
}
dirInfo = dirInfo.Parent;
}
return null;
}
}
}

GVFS/FastFetch/Jobs/BatchObjectDownloadJob.cs Normal file
@@ -0,0 +1,253 @@
using FastFetch.Jobs.Data;
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
namespace FastFetch.Jobs
{
/// <summary>
/// Takes in blocks of object shas, downloads object shas as a pack or loose object, outputs pack locations (if applicable).
/// </summary>
public class BatchObjectDownloadJob : Job
{
private const string AreaPath = "BatchObjectDownloadJob";
private const string DownloadAreaPath = "Download";
private static readonly TimeSpan HeartBeatPeriod = TimeSpan.FromSeconds(20);
private readonly BlockingAggregator<string, BlobDownloadRequest> inputQueue;
private int activeDownloadCount;
private ITracer tracer;
private Enlistment enlistment;
private HttpGitObjects httpGitObjects;
private GitObjects gitObjects;
private Timer heartbeat;
private long bytesDownloaded = 0;
public BatchObjectDownloadJob(
int maxParallel,
int chunkSize,
BlockingCollection<string> inputQueue,
BlockingCollection<string> availableBlobs,
ITracer tracer,
Enlistment enlistment,
HttpGitObjects httpGitObjects,
GitObjects gitObjects)
: base(maxParallel)
{
this.tracer = tracer.StartActivity(AreaPath, EventLevel.Informational);
this.inputQueue = new BlockingAggregator<string, BlobDownloadRequest>(inputQueue, chunkSize, objectIds => new BlobDownloadRequest(objectIds));
this.enlistment = enlistment;
this.httpGitObjects = httpGitObjects;
this.gitObjects = gitObjects;
this.AvailablePacks = new BlockingCollection<IndexPackRequest>();
this.AvailableObjects = availableBlobs;
}
public BlockingCollection<IndexPackRequest> AvailablePacks { get; }
public BlockingCollection<string> AvailableObjects { get; }
protected override void DoBeforeWork()
{
this.heartbeat = new Timer(this.EmitHeartbeat, null, TimeSpan.Zero, HeartBeatPeriod);
base.DoBeforeWork();
}
protected override void DoWork()
{
BlobDownloadRequest request;
while (this.inputQueue.TryTake(out request))
{
Interlocked.Increment(ref this.activeDownloadCount);
EventMetadata metadata = new EventMetadata();
metadata.Add("PackId", request.PackId);
metadata.Add("ActiveDownloads", this.activeDownloadCount);
metadata.Add("NumberOfObjects", request.ObjectIds.Count);
using (ITracer activity = this.tracer.StartActivity(DownloadAreaPath, EventLevel.Informational, metadata))
{
try
{
RetryWrapper<HttpGitObjects.GitObjectTaskResult>.InvocationResult result;
if (request.ObjectIds.Count == 1)
{
result = this.httpGitObjects.TryDownloadLooseObject(
request.ObjectIds[0],
onSuccess: (tryCount, response) => this.WriteObjectOrPackAsync(request, tryCount, response),
onFailure: RetryWrapper<HttpGitObjects.GitObjectTaskResult>.StandardErrorHandler(activity, DownloadAreaPath));
}
else
{
HashSet<string> successfulDownloads = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
result = this.httpGitObjects.TryDownloadObjects(
() => request.ObjectIds.Except(successfulDownloads),
commitDepth: 1,
onSuccess: (tryCount, response) => this.WriteObjectOrPackAsync(request, tryCount, response, successfulDownloads),
onFailure: RetryWrapper<HttpGitObjects.GitObjectTaskResult>.StandardErrorHandler(activity, DownloadAreaPath),
preferBatchedLooseObjects: true);
}
if (!result.Succeeded)
{
this.HasFailures = true;
}
metadata.Add("Success", result.Succeeded);
metadata.Add("AttemptNumber", result.Attempts);
metadata["ActiveDownloads"] = this.activeDownloadCount - 1;
activity.Stop(metadata);
}
finally
{
Interlocked.Decrement(ref this.activeDownloadCount);
}
}
}
}
protected override void DoAfterWork()
{
this.heartbeat.Dispose();
this.heartbeat = null;
this.AvailablePacks.CompleteAdding();
EventMetadata metadata = new EventMetadata();
metadata.Add("RequestCount", BlobDownloadRequest.TotalRequests);
metadata.Add("BytesDownloaded", this.bytesDownloaded);
this.tracer.Stop(metadata);
}
private RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult WriteObjectOrPackAsync(
BlobDownloadRequest request,
int tryCount,
HttpGitObjects.GitEndPointResponseData response,
HashSet<string> successfulDownloads = null)
{
string fileName = null;
switch (response.ContentType)
{
case HttpGitObjects.ContentType.LooseObject:
string sha = request.ObjectIds.First();
fileName = this.gitObjects.WriteLooseObject(
this.enlistment.WorkingDirectoryRoot,
response.Stream,
sha);
this.AvailableObjects.Add(sha);
break;
case HttpGitObjects.ContentType.PackFile:
fileName = this.gitObjects.WriteTempPackFile(response);
this.AvailablePacks.Add(new IndexPackRequest(fileName, request));
break;
case HttpGitObjects.ContentType.BatchedLooseObjects:
// To reduce allocations, reuse the same buffer when writing objects in this batch
byte[] bufToCopyWith = new byte[StreamUtil.DefaultCopyBufferSize];
OnLooseObject onLooseObject = (objectStream, sha1) =>
{
this.gitObjects.WriteLooseObject(
this.enlistment.WorkingDirectoryRoot,
objectStream,
sha1,
bufToCopyWith);
this.AvailableObjects.Add(sha1);
if (successfulDownloads != null)
{
successfulDownloads.Add(sha1);
}
// This isn't strictly correct because we don't add object header bytes,
// just the actual compressed content length, but we expect the amount of
// header data to be negligible compared to the objects themselves.
Interlocked.Add(ref this.bytesDownloaded, objectStream.Length);
};
new BatchedLooseObjectDeserializer(response.Stream, onLooseObject).ProcessObjects();
break;
}
if (fileName != null)
{
// NOTE: If we are writing a file as part of this method, the only case
// where it's not expected to exist is when running unit tests
FileInfo info = new FileInfo(fileName);
if (info.Exists)
{
Interlocked.Add(ref this.bytesDownloaded, info.Length);
}
else
{
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(
new HttpGitObjects.GitObjectTaskResult(false));
}
}
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(
new HttpGitObjects.GitObjectTaskResult(true));
}
private void EmitHeartbeat(object state)
{
EventMetadata metadata = new EventMetadata();
metadata["ActiveDownloads"] = this.activeDownloadCount;
this.tracer.RelatedEvent(EventLevel.Verbose, "DownloadHeartbeat", metadata);
}
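// Aggregates single items from the input queue into chunks of at most chunkSize. TryTake blocks until
// items are available and returns false only once the input queue has completed adding and drained.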
private class BlockingAggregator<InputType, OutputType>
{
private BlockingCollection<InputType> inputQueue;
private int chunkSize;
private Func<List<InputType>, OutputType> factory;
public BlockingAggregator(BlockingCollection<InputType> input, int chunkSize, Func<List<InputType>, OutputType> factory)
{
this.inputQueue = input;
this.chunkSize = chunkSize;
this.factory = factory;
}
public bool TryTake(out OutputType output)
{
List<InputType> intermediary = new List<InputType>();
for (int i = 0; i < this.chunkSize; ++i)
{
InputType data;
if (this.inputQueue.TryTake(out data, millisecondsTimeout: -1))
{
intermediary.Add(data);
}
else
{
break;
}
}
if (intermediary.Any())
{
output = this.factory(intermediary);
return true;
}
output = default(OutputType);
return false;
}
}
}
}

View file

@ -0,0 +1,29 @@
using System.Collections.Generic;
using System.Linq;
using System.Threading;
namespace FastFetch.Jobs.Data
{
public class BlobDownloadRequest
{
private static int requestCounter = 0;
public BlobDownloadRequest(IReadOnlyList<string> objectIds)
{
this.ObjectIds = objectIds;
this.PackId = Interlocked.Increment(ref requestCounter);
}
public static int TotalRequests
{
get
{
return requestCounter;
}
}
public IReadOnlyList<string> ObjectIds { get; }
public int PackId { get; }
}
}

View file

@ -0,0 +1,21 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace FastFetch.Jobs.Data
{
public class IndexPackRequest
{
public IndexPackRequest(string tempPackFile, BlobDownloadRequest downloadRequest)
{
this.TempPackFile = tempPackFile;
this.DownloadRequest = downloadRequest;
}
public BlobDownloadRequest DownloadRequest { get; }
public string TempPackFile { get; }
}
}

View file

@ -0,0 +1,18 @@
namespace FastFetch.Jobs.Data
{
public class SearchTreeRequest
{
public SearchTreeRequest(string treeSha, string rootPath, bool shouldRecurse)
{
this.TreeSha = treeSha;
this.RootPath = rootPath;
this.ShouldRecurse = shouldRecurse;
}
public bool ShouldRecurse { get; }
public string TreeSha { get; }
public string RootPath { get; }
}
}

View file

@ -0,0 +1,89 @@
using FastFetch.Jobs.Data;
using GVFS.Common;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
namespace FastFetch.Jobs
{
/// <summary>
/// Takes in blob shas, checks each one for local existence via git cat-file --batch-check, and routes missing shas to the download queue and already-present shas to the available-blobs queue.
/// </summary>
public class FindMissingBlobsJob : Job
{
private const string AreaPath = nameof(FindMissingBlobsJob);
private const string TreeSearchAreaPath = "TreeSearch";
private ITracer tracer;
private Enlistment enlistment;
private int missingBlobCount;
private int availableBlobCount;
private BlockingCollection<string> inputQueue;
private ConcurrentHashSet<string> alreadyFoundBlobIds;
private ProcessPool<GitCatFileBatchCheckProcess> catFilePool;
public FindMissingBlobsJob(
int maxParallel,
BlockingCollection<string> inputQueue,
BlockingCollection<string> availableBlobs,
ITracer tracer,
Enlistment enlistment)
: base(maxParallel)
{
this.tracer = tracer.StartActivity(AreaPath, EventLevel.Informational);
this.inputQueue = inputQueue;
this.enlistment = enlistment;
this.alreadyFoundBlobIds = new ConcurrentHashSet<string>();
this.DownloadQueue = new BlockingCollection<string>();
this.AvailableBlobs = availableBlobs;
this.catFilePool = new ProcessPool<GitCatFileBatchCheckProcess>(
tracer,
() => new GitCatFileBatchCheckProcess(this.enlistment),
maxParallel);
}
public BlockingCollection<string> DownloadQueue { get; }
public BlockingCollection<string> AvailableBlobs { get; }
protected override void DoWork()
{
string blobId;
while (this.inputQueue.TryTake(out blobId, Timeout.Infinite))
{
this.catFilePool.Invoke(catFileProcess =>
{
if (!catFileProcess.ObjectExists(blobId))
{
Interlocked.Increment(ref this.missingBlobCount);
this.DownloadQueue.Add(blobId);
}
else
{
Interlocked.Increment(ref this.availableBlobCount);
this.AvailableBlobs.Add(blobId);
}
});
}
}
protected override void DoAfterWork()
{
this.DownloadQueue.CompleteAdding();
this.catFilePool.Dispose();
EventMetadata metadata = new EventMetadata();
metadata.Add("TotalMissingObjects", this.missingBlobCount);
metadata.Add("AvailableObjects", this.availableBlobCount);
this.tracer.Stop(metadata);
}
}
}

View file

@ -0,0 +1,79 @@
using FastFetch.Jobs.Data;
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System.Collections.Concurrent;
using System.Threading;
namespace FastFetch.Jobs
{
public class IndexPackJob : Job
{
private const string AreaPath = "IndexPackJob";
private const string IndexPackAreaPath = "IndexPack";
private readonly BlockingCollection<IndexPackRequest> inputQueue;
private ITracer tracer;
private GitObjects gitObjects;
private long shasIndexed = 0;
public IndexPackJob(
int maxParallel,
BlockingCollection<IndexPackRequest> inputQueue,
BlockingCollection<string> availableBlobs,
ITracer tracer,
GitObjects gitObjects)
: base(maxParallel)
{
this.tracer = tracer.StartActivity(AreaPath, EventLevel.Informational);
this.inputQueue = inputQueue;
this.gitObjects = gitObjects;
this.AvailableBlobs = availableBlobs;
}
public BlockingCollection<string> AvailableBlobs { get; }
protected override void DoWork()
{
IndexPackRequest request;
while (this.inputQueue.TryTake(out request, millisecondsTimeout: -1))
{
EventMetadata metadata = new EventMetadata();
metadata.Add("PackId", request.DownloadRequest.PackId);
using (ITracer activity = this.tracer.StartActivity(IndexPackAreaPath, EventLevel.Informational, metadata))
{
GitProcess.Result result = this.gitObjects.IndexTempPackFile(request.TempPackFile);
if (result.HasErrors)
{
EventMetadata errorMetadata = new EventMetadata();
errorMetadata.Add("PackId", request.DownloadRequest.PackId);
errorMetadata.Add("ErrorMessage", result.Errors);
activity.RelatedError(errorMetadata);
this.HasFailures = true;
}
if (!this.HasFailures)
{
foreach (string blobId in request.DownloadRequest.ObjectIds)
{
this.AvailableBlobs.Add(blobId);
Interlocked.Increment(ref this.shasIndexed);
}
}
metadata.Add("Success", !this.HasFailures);
activity.Stop(metadata);
}
}
}
protected override void DoAfterWork()
{
EventMetadata metadata = new EventMetadata();
metadata.Add("ShasIndexed", this.shasIndexed);
this.tracer.Stop(metadata);
}
}
}

View file

@ -0,0 +1,59 @@
using System;
using System.Threading;
namespace FastFetch.Jobs
{
public abstract class Job
{
private int maxParallel;
private Thread[] workers;
public Job(int maxParallel)
{
this.maxParallel = maxParallel;
}
public bool HasFailures { get; protected set; }
public void Start()
{
if (this.workers != null)
{
throw new InvalidOperationException("Cannot call start twice");
}
this.DoBeforeWork();
this.workers = new Thread[this.maxParallel];
for (int i = 0; i < this.workers.Length; ++i)
{
this.workers[i] = new Thread(this.DoWork);
this.workers[i].Start();
}
}
public void WaitForCompletion()
{
if (this.workers == null)
{
throw new InvalidOperationException("Cannot wait for completion before start is called");
}
foreach (Thread t in this.workers)
{
t.Join();
}
this.DoAfterWork();
this.workers = null;
}
protected virtual void DoBeforeWork()
{
}
protected abstract void DoWork();
protected abstract void DoAfterWork();
}
}
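A minimal sketch (illustrative only, not part of this commit) of a concrete job built on the base class above, assuming a BlockingCollection<string> feeds the workers:
using System.Collections.Concurrent;
using System.Threading;
namespace FastFetch.Jobs
{
    public class PrintShasJob : Job
    {
        private readonly BlockingCollection<string> input;
        public PrintShasJob(int maxParallel, BlockingCollection<string> input)
            : base(maxParallel)
        {
            this.input = input;
        }
        protected override void DoWork()
        {
            // Each of the maxParallel worker threads drains the shared queue until
            // CompleteAdding() has been called and the queue is empty.
            string sha;
            while (this.input.TryTake(out sha, Timeout.Infinite))
            {
                System.Console.WriteLine(sha);
            }
        }
        protected override void DoAfterWork()
        {
            // Runs once, on the thread that called WaitForCompletion, after all workers join.
        }
    }
}
A caller would construct the job, call Start(), mark the input queue with CompleteAdding(), and then call WaitForCompletion().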

13
GVFS/FastFetch/Program.cs Normal file
View file

@ -0,0 +1,13 @@
using CommandLine;
namespace FastFetch
{
public class Program
{
public static void Main(string[] args)
{
Parser.Default.ParseArguments<FastFetchVerb>(args)
.WithParsed(fastFetch => fastFetch.Execute());
}
}
}

View file

@ -0,0 +1,22 @@
using System.Reflection;
using System.Runtime.InteropServices;
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("FastFetch")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("FastFetch")]
[assembly: AssemblyCopyright("Copyright © Microsoft 2016")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]
// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("07f2a520-2ab7-46dd-97c0-75d8e988d55b")]

View file

@ -0,0 +1,9 @@
<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="CommandLineParser" version="2.0.275-beta" targetFramework="net452" />
<package id="Microsoft.Diagnostics.Tracing.EventRegister" version="1.1.28" targetFramework="net452" />
<package id="Microsoft.Diagnostics.Tracing.EventSource" version="1.1.28" targetFramework="net452" />
<package id="Microsoft.Diagnostics.Tracing.EventSource.Redist" version="1.1.28" targetFramework="net452" />
<package id="StyleCop.Error.MSBuild" version="1.0.0" targetFramework="net452" />
<package id="StyleCop.MSBuild" version="4.7.54.0" targetFramework="net452" developmentDependency="true" />
</packages>

View file

@ -0,0 +1,53 @@
using System;
namespace GVFS.Common
{
public static class AntiVirusExclusions
{
public static void AddAntiVirusExclusion(string path)
{
try
{
CallPowershellCommand("Add-MpPreference -ExclusionPath \"" + path + "\"");
}
catch (Exception e)
{
Console.WriteLine("Unable to add exclusion: " + e.ToString());
}
}
public static bool TryGetIsPathExcluded(string path, out bool isExcluded)
{
isExcluded = false;
try
{
ProcessResult getMpPreferencesResult = CallPowershellCommand("Get-MpPreference | Select -ExpandProperty ExclusionPath");
if (getMpPreferencesResult.ExitCode == 0)
{
foreach (string excludedPath in getMpPreferencesResult.Output.Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries))
{
if (excludedPath.Trim().Equals(path, StringComparison.OrdinalIgnoreCase))
{
isExcluded = true;
break;
}
}
}
}
catch (Exception e)
{
Console.WriteLine("Unable to get exclusions:" + e.ToString());
return false;
}
return true;
}
private static ProcessResult CallPowershellCommand(string command)
{
return ProcessHelper.Run("powershell", "-NoProfile -Command \"& { " + command + " }\"");
}
}
}
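A short usage sketch (illustrative, not part of this commit), using a hypothetical enlistment path:
bool isExcluded;
if (AntiVirusExclusions.TryGetIsPathExcluded(@"C:\example\enlistment", out isExcluded) && !isExcluded)
{
    // Only add an exclusion if we could query the current list and the path is not already on it.
    AntiVirusExclusions.AddAntiVirusExclusion(@"C:\example\enlistment");
}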

View file

@ -0,0 +1,132 @@
using System;
using System.IO;
using System.Linq;
using System.Text;
namespace GVFS.Common
{
/// <summary>
/// Invoked when the full content of a single loose object is available.
/// </summary>
public delegate void OnLooseObject(Stream objectStream, string sha1);
/// <summary>
/// Deserializer for concatenated loose objects.
/// </summary>
public class BatchedLooseObjectDeserializer
{
private const int NumObjectIdBytes = 20;
private const int NumObjectHeaderBytes = NumObjectIdBytes + sizeof(long);
private static readonly byte[] ExpectedHeader
= new byte[]
{
(byte)'G', (byte)'V', (byte)'F', (byte)'S', (byte)' ', // Magic
1 // Version
};
private readonly Stream source;
private readonly OnLooseObject onLooseObject;
public BatchedLooseObjectDeserializer(Stream source, OnLooseObject onLooseObject)
{
this.source = source;
this.onLooseObject = onLooseObject;
}
/// <summary>
/// Read all the objects from the source stream and call <see cref="OnLooseObject"/> for each.
/// </summary>
/// <returns>The total number of objects read</returns>
public int ProcessObjects()
{
this.ValidateHeader();
// Start reading objects
int numObjectsRead = 0;
byte[] curObjectHeader = new byte[NumObjectHeaderBytes];
while (true)
{
bool keepReading = this.ShouldContinueReading(curObjectHeader);
if (!keepReading)
{
break;
}
// Get the length
long curLength = BitConverter.ToInt64(curObjectHeader, NumObjectIdBytes);
// Handle the loose object
using (Stream rawObjectData = new RestrictedStream(this.source, 0, curLength, leaveOpen: true))
{
string objectId = SHA1Util.HexStringFromBytes(curObjectHeader, NumObjectIdBytes);
if (objectId.Equals(GVFSConstants.AllZeroSha))
{
throw new RetryableException("Received all-zero SHA before end of stream");
}
this.onLooseObject(rawObjectData, objectId);
numObjectsRead++;
}
}
return numObjectsRead;
}
/// <summary>
/// Parse the current object header to check if we've reached the end.
/// </summary>
/// <returns>true if a full object header was read and processing should continue, false if the end-of-stream marker was reached</returns>
private bool ShouldContinueReading(byte[] curObjectHeader)
{
int totalBytes = StreamUtil.TryReadGreedy(
this.source,
curObjectHeader,
0,
curObjectHeader.Length);
if (totalBytes == NumObjectHeaderBytes)
{
// Successful header read
return true;
}
else if (totalBytes == NumObjectIdBytes)
{
// We may have finished reading all the objects
for (int i = 0; i < NumObjectIdBytes; i++)
{
if (curObjectHeader[i] != 0)
{
throw new RetryableException(
string.Format(
"Reached end of stream before we got the expected zero-object ID Buffer: {0}",
SHA1Util.HexStringFromBytes(curObjectHeader)));
}
}
return false;
}
else
{
throw new RetryableException(
string.Format(
"Reached end of stream before expected {0} or {1} bytes. Got {2}. Buffer: {3}",
NumObjectHeaderBytes,
NumObjectIdBytes,
totalBytes,
SHA1Util.HexStringFromBytes(curObjectHeader)));
}
}
private void ValidateHeader()
{
byte[] headerBuf = new byte[ExpectedHeader.Length];
StreamUtil.TryReadGreedy(this.source, headerBuf, 0, headerBuf.Length);
if (!headerBuf.SequenceEqual(ExpectedHeader))
{
throw new InvalidDataException("Unexpected header: " + Encoding.UTF8.GetString(headerBuf));
}
}
}
}
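The stream layout this class consumes, as implemented above: a six-byte header ('G', 'V', 'F', 'S', ' ', version 1), then one record per object consisting of a 20-byte raw SHA-1, an 8-byte length read via BitConverter.ToInt64, and that many bytes of object content, terminated by 20 zero bytes. A minimal caller sketch (illustrative, not part of this commit), assuming responseStream carries such a payload:
int objectCount = new BatchedLooseObjectDeserializer(
    responseStream,
    (objectStream, sha1) =>
    {
        // Persist the loose object; objectStream is only valid inside this callback.
    }).ProcessObjects();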

View file

@ -0,0 +1,164 @@
using System;
using System.IO;
namespace GVFS.Common
{
/// <summary>
/// Stream wrapper for a length-limited subview of another stream.
/// </summary>
internal class RestrictedStream : Stream
{
private readonly Stream stream;
private readonly long length;
private readonly bool leaveOpen;
private long position;
private bool closed;
public RestrictedStream(Stream stream, long offset, long length, bool leaveOpen = false)
{
this.stream = stream;
this.length = length;
this.leaveOpen = leaveOpen;
if (offset != 0)
{
if (!this.stream.CanSeek)
{
throw new InvalidOperationException();
}
this.stream.Seek(offset, SeekOrigin.Current);
}
}
public override bool CanRead
{
get
{
return true;
}
}
public override bool CanSeek
{
get
{
return this.stream.CanSeek;
}
}
public override bool CanWrite
{
get
{
return false;
}
}
public override long Length
{
get
{
return this.length;
}
}
public override long Position
{
get
{
return this.position;
}
set
{
this.Seek(value, SeekOrigin.Begin);
}
}
public override void Close()
{
if (!this.closed)
{
this.closed = true;
if (!this.leaveOpen)
{
this.stream.Close();
}
}
base.Close();
}
public override int Read(byte[] buffer, int offset, int count)
{
int bytesToRead = (int)(Math.Min(this.position + count, this.length) - this.position);
// Some streams like HttpContent.ReadOnlyStream throw InvalidOperationException
// when reading 0 bytes from huge streams. If that changes we can remove this check.
if (bytesToRead == 0)
{
return 0;
}
int toReturn = this.stream.Read(buffer, offset, bytesToRead);
this.position += toReturn;
return toReturn;
}
public override long Seek(long offset, SeekOrigin origin)
{
if (!this.stream.CanSeek)
{
throw new InvalidOperationException();
}
long newPosition;
switch (origin)
{
case SeekOrigin.Begin:
newPosition = offset;
break;
case SeekOrigin.Current:
newPosition = this.position + offset;
break;
case SeekOrigin.End:
newPosition = this.length + offset;
break;
default:
throw new InvalidOperationException();
}
newPosition = Math.Max(Math.Min(this.length, newPosition), 0);
this.stream.Seek(newPosition - this.position, SeekOrigin.Current);
this.position = newPosition;
return newPosition;
}
public override void Flush()
{
throw new NotSupportedException();
}
public override void SetLength(long value)
{
throw new NotSupportedException();
}
public override void Write(byte[] buffer, int offset, int count)
{
throw new NotSupportedException();
}
}
}
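A brief sketch (illustrative, not part of this commit) of the intended use, assuming source is a readable Stream and objectLength is the number of bytes to expose:
using (Stream window = new RestrictedStream(source, offset: 0, length: objectLength, leaveOpen: true))
{
    // Reads stop after objectLength bytes even if the caller asks for more,
    // and disposing the window leaves the underlying stream open.
    byte[] buffer = new byte[4096];
    int bytesRead;
    while ((bytesRead = window.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Consume buffer[0..bytesRead).
    }
}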

View file

@ -0,0 +1,9 @@
namespace GVFS.Common
{
public enum CallbackResult
{
Success,
RetryableError,
FatalError
}
}

View file

@ -0,0 +1,58 @@
using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;
namespace GVFS.Common
{
public class ConcurrentHashSet<T> : IEnumerable<T>
{
private ConcurrentDictionary<T, bool> dictionary;
public ConcurrentHashSet()
{
this.dictionary = new ConcurrentDictionary<T, bool>();
}
public ConcurrentHashSet(IEqualityComparer<T> comparer)
{
this.dictionary = new ConcurrentDictionary<T, bool>(comparer);
}
public int Count
{
get { return this.dictionary.Count; }
}
public bool Add(T entry)
{
return this.dictionary.TryAdd(entry, true);
}
public bool Contains(T item)
{
return this.dictionary.ContainsKey(item);
}
public void Clear()
{
this.dictionary.Clear();
}
public IEnumerator<T> GetEnumerator()
{
return this.dictionary.Keys.GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return this.GetEnumerator();
}
public bool TryRemove(T key)
{
bool value;
return this.dictionary.TryRemove(key, out value);
}
}
}
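A short usage sketch (illustrative, not part of this commit):
ConcurrentHashSet<string> seenShas = new ConcurrentHashSet<string>(StringComparer.OrdinalIgnoreCase);
if (seenShas.Add("exampleSha"))
{
    // Add returns true only for the first thread to insert a given key.
}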

View file

@ -0,0 +1,135 @@
using GVFS.Common.Git;
using System;
using System.IO;
namespace GVFS.Common
{
public abstract class Enlistment
{
private const string ObjectsEndpointSuffix = "/gvfs/objects";
private const string PrefetchEndpointSuffix = "/gvfs/prefetch";
private const string DeprecatedObjectsEndpointGitConfigName = "gvfs.objects-endpoint";
private const string GVFSGitConfigPrefix = "gvfs.";
private const string CacheEndpointGitConfigSuffix = ".cache-server-url";
// New enlistment
protected Enlistment(string enlistmentRoot, string workingDirectoryRoot, string repoUrl, string cacheServerUrl, string gitBinPath, string gvfsHooksRoot)
{
if (string.IsNullOrWhiteSpace(gitBinPath))
{
throw new ArgumentException("Path to git.exe must be set");
}
this.EnlistmentRoot = enlistmentRoot;
this.WorkingDirectoryRoot = workingDirectoryRoot;
this.GitBinPath = gitBinPath;
this.GVFSHooksRoot = gvfsHooksRoot;
this.RepoUrl = repoUrl;
this.SetComputedPaths();
this.SetComputedURLs(cacheServerUrl);
}
// Existing, configured enlistment
protected Enlistment(string enlistmentRoot, string workingDirectoryRoot, string cacheServerUrl, string gitBinPath, string gvfsHooksRoot)
{
if (string.IsNullOrWhiteSpace(gitBinPath))
{
throw new ArgumentException("Path to git.exe must be set");
}
this.EnlistmentRoot = enlistmentRoot;
this.WorkingDirectoryRoot = workingDirectoryRoot;
this.GitBinPath = gitBinPath;
this.GVFSHooksRoot = gvfsHooksRoot;
this.SetComputedPaths();
GitProcess.Result originResult = new GitProcess(this).GetOriginUrl();
if (originResult.HasErrors)
{
throw new InvalidRepoException("Could not get origin url. git error: " + originResult.Errors);
}
this.RepoUrl = originResult.Output;
this.SetComputedURLs(cacheServerUrl);
}
public string EnlistmentRoot { get; }
public string WorkingDirectoryRoot { get; }
public string DotGitRoot { get; private set; }
public string GitPackRoot { get; private set; }
public string RepoUrl { get; }
public string CacheServerUrl { get; private set; }
public string ObjectsEndpointUrl { get; private set; }
public string PrefetchEndpointUrl { get; private set; }
public string GitBinPath { get; }
public string GVFSHooksRoot { get; }
public static string StripObjectsEndpointSuffix(string input)
{
if (!string.IsNullOrWhiteSpace(input) && input.EndsWith(ObjectsEndpointSuffix))
{
input = input.Substring(0, input.Length - ObjectsEndpointSuffix.Length);
}
return input;
}
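/// <summary>
/// Maps a repo URL to the git config key that stores its cache server URL.
/// For example (hypothetical URL), "https://dev.example.com/org/repo" maps to
/// "gvfs.dev.example.com.org.repo.cache-server-url".
/// </summary>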
protected static string GetCacheConfigSettingName(string repoUrl)
{
string sectionUrl = repoUrl.ToLowerInvariant()
.Replace("https://", string.Empty)
.Replace("http://", string.Empty)
.Replace('/', '.');
return GVFSGitConfigPrefix + sectionUrl + CacheEndpointGitConfigSuffix;
}
protected string GetCacheServerUrlFromConfig(string repoUrl)
{
GitProcess git = new GitProcess(this);
string cacheConfigName = GetCacheConfigSettingName(repoUrl);
string cacheServerUrl = git.GetFromConfig(cacheConfigName);
if (string.IsNullOrWhiteSpace(cacheServerUrl))
{
// Try getting from the deprecated setting for compatibility reasons
cacheServerUrl = StripObjectsEndpointSuffix(git.GetFromConfig(DeprecatedObjectsEndpointGitConfigName));
// Upgrade for future runs, but not at clone time.
if (!string.IsNullOrWhiteSpace(cacheServerUrl) && Directory.Exists(this.WorkingDirectoryRoot))
{
git.SetInLocalConfig(cacheConfigName, cacheServerUrl);
git.DeleteFromLocalConfig(DeprecatedObjectsEndpointGitConfigName);
}
}
// Default to uncached url
if (string.IsNullOrWhiteSpace(cacheServerUrl))
{
return repoUrl;
}
return cacheServerUrl;
}
private void SetComputedPaths()
{
this.DotGitRoot = Path.Combine(this.WorkingDirectoryRoot, GVFSConstants.DotGit.Root);
this.GitPackRoot = Path.Combine(this.WorkingDirectoryRoot, GVFSConstants.DotGit.Objects.Pack.Root);
}
private void SetComputedURLs(string cacheServerUrl)
{
this.CacheServerUrl = !string.IsNullOrWhiteSpace(cacheServerUrl) ? cacheServerUrl : this.GetCacheServerUrlFromConfig(this.RepoUrl);
this.ObjectsEndpointUrl = this.CacheServerUrl + ObjectsEndpointSuffix;
this.PrefetchEndpointUrl = this.CacheServerUrl + PrefetchEndpointSuffix;
}
}
}

View file

@ -0,0 +1,355 @@
using GVFS.Common.Physical.FileSystem;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.ComponentModel;
using System.IO;
using System.Text;
namespace GVFS.Common
{
public class FileBasedLock : IDisposable
{
private const int DefaultStreamWriterBufferSize = 1024; // Copied from: http://referencesource.microsoft.com/#mscorlib/system/io/streamwriter.cs,5516ce201dc06b5f
private const long InvalidFileLength = -1;
private static readonly Encoding UTF8NoBOM = new UTF8Encoding(false, true); // Default encoding used by StreamWriter
private readonly object deleteOnCloseStreamLock = new object();
private readonly PhysicalFileSystem fileSystem;
private readonly string lockPath;
private ITracer tracer;
private FileStream deleteOnCloseStream;
public FileBasedLock(PhysicalFileSystem fileSystem, ITracer tracer, string lockPath, string signature, ExistingLockCleanup existingLockCleanup)
{
this.fileSystem = fileSystem;
this.tracer = tracer;
this.lockPath = lockPath;
this.Signature = signature;
if (existingLockCleanup != ExistingLockCleanup.LeaveExisting)
{
this.CleanupStaleLock(existingLockCleanup);
}
}
public enum ExistingLockCleanup
{
LeaveExisting,
DeleteExisting,
DeleteExistingAndLogSignature
}
public string Signature { get; private set; }
public bool TryAcquireLockAndDeleteOnClose()
{
try
{
lock (this.deleteOnCloseStreamLock)
{
if (this.IsOpen())
{
return true;
}
this.deleteOnCloseStream = (FileStream)this.fileSystem.OpenFileStream(
this.lockPath,
FileMode.CreateNew,
(FileAccess)(NativeMethods.FileAccess.FILE_GENERIC_READ | NativeMethods.FileAccess.FILE_GENERIC_WRITE | NativeMethods.FileAccess.DELETE),
NativeMethods.FileAttributes.FILE_FLAG_DELETE_ON_CLOSE,
FileShare.Read);
// Pass in true for leaveOpen to ensure that lockStream stays open
using (StreamWriter writer = new StreamWriter(
this.deleteOnCloseStream,
UTF8NoBOM,
DefaultStreamWriterBufferSize,
leaveOpen: true))
{
this.WriteSignatureAndMessage(writer, message: null);
}
return true;
}
}
catch (NativeMethods.Win32FileExistsException)
{
this.DisposeStream();
return false;
}
catch (IOException e)
{
EventMetadata metadata = this.CreateLockMetadata("IOException caught while trying to acquire lock", e);
this.tracer.RelatedEvent(EventLevel.Warning, "TryAcquireLockAndDeleteOnClose", metadata);
this.DisposeStream();
return false;
}
catch (Win32Exception e)
{
EventMetadata metadata = this.CreateLockMetadata("Win32Exception caught while trying to acquire lock", e);
this.tracer.RelatedEvent(EventLevel.Warning, "TryAcquireLockAndDeleteOnClose", metadata);
this.DisposeStream();
return false;
}
catch (Exception e)
{
EventMetadata metadata = this.CreateLockMetadata("Unhandled exception caught while trying to acquire lock", e);
this.tracer.RelatedError("TryAcquireLockAndDeleteOnClose", metadata);
this.DisposeStream();
throw;
}
}
public bool TryReleaseLock()
{
if (this.DisposeStream())
{
return true;
}
LockData lockData = this.GetLockDataFromDisk();
if (lockData == null || lockData.Signature != this.Signature)
{
if (lockData == null)
{
throw new LockFileDoesNotExistException(this.lockPath);
}
throw new LockSignatureDoesNotMatchException(this.lockPath, this.Signature, lockData.Signature);
}
try
{
this.fileSystem.DeleteFile(this.lockPath);
}
catch (IOException e)
{
EventMetadata metadata = this.CreateLockMetadata("IOException caught while trying to release lock", e);
this.tracer.RelatedEvent(EventLevel.Warning, "TryReleaseLock", metadata);
return false;
}
return true;
}
public bool IsOpen()
{
return this.deleteOnCloseStream != null;
}
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
protected void Dispose(bool disposing)
{
this.DisposeStream();
}
private LockData GetLockDataFromDisk()
{
if (this.LockFileExists())
{
string existingSignature;
string existingMessage;
this.ReadLockFile(out existingSignature, out existingMessage);
return new LockData(existingSignature, existingMessage);
}
return null;
}
private void ReadLockFile(out string existingSignature, out string lockerMessage)
{
using (Stream fs = this.fileSystem.OpenFileStream(this.lockPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite | FileShare.Delete))
using (StreamReader reader = new StreamReader(fs, UTF8NoBOM))
{
existingSignature = reader.ReadLine();
lockerMessage = reader.ReadLine();
}
existingSignature = existingSignature ?? string.Empty;
lockerMessage = lockerMessage ?? string.Empty;
}
private bool LockFileExists()
{
return this.fileSystem.FileExists(this.lockPath);
}
private void CleanupStaleLock(ExistingLockCleanup existingLockCleanup)
{
if (!this.LockFileExists())
{
return;
}
if (existingLockCleanup == ExistingLockCleanup.LeaveExisting)
{
throw new ArgumentException("CleanupStaleLock should not be called with LeaveExisting");
}
EventMetadata metadata = this.CreateLockMetadata();
metadata.Add("existingLockCleanup", existingLockCleanup.ToString());
long length = InvalidFileLength;
try
{
FileProperties existingLockProperties = this.fileSystem.GetFileProperties(this.lockPath);
length = existingLockProperties.Length;
}
catch (Exception e)
{
metadata.Add("Exception", "Exception while getting lock file length: " + e.ToString());
this.tracer.RelatedEvent(EventLevel.Warning, "CleanupEmptyLock", metadata);
}
if (length == 0)
{
metadata.Add("Message", "Deleting empty lock file: " + this.lockPath);
this.tracer.RelatedEvent(EventLevel.Warning, "CleanupEmptyLock", metadata);
}
else
{
metadata.Add("Length", length == InvalidFileLength ? "Invalid" : length.ToString());
switch (existingLockCleanup)
{
case ExistingLockCleanup.DeleteExisting:
metadata.Add("Message", "Deleting stale lock file: " + this.lockPath);
this.tracer.RelatedEvent(EventLevel.Informational, "CleanupExistingLock", metadata);
break;
case ExistingLockCleanup.DeleteExistingAndLogSignature:
string existingSignature;
try
{
string dummyLockerMessage;
this.ReadLockFile(out existingSignature, out dummyLockerMessage);
}
catch (Win32Exception e)
{
if (e.ErrorCode == NativeMethods.ERROR_FILE_NOT_FOUND)
{
// File was deleted before we could read its contents
return;
}
throw;
}
if (existingSignature == this.Signature)
{
metadata.Add("Message", "Deleting stale lock file: " + this.lockPath);
this.tracer.RelatedEvent(EventLevel.Informational, "CleanupExistingLock", metadata);
}
else
{
metadata.Add("ExistingLockSignature", existingSignature);
metadata.Add("Message", "Deleting stale lock file: " + this.lockPath + " with mismatched signature");
this.tracer.RelatedEvent(EventLevel.Warning, "CleanupSignatureMismatchLock", metadata);
}
break;
default:
throw new InvalidOperationException("Invalid ExistingLockCleanup");
}
}
this.fileSystem.DeleteFile(this.lockPath);
}
private void WriteSignatureAndMessage(StreamWriter writer, string message)
{
writer.WriteLine(this.Signature);
if (message != null)
{
writer.Write(message);
}
}
private EventMetadata CreateLockMetadata(string message = null, Exception exception = null, bool errorMessage = false)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "FileBasedLock");
metadata.Add("LockPath", this.lockPath);
metadata.Add("Signature", this.Signature);
if (message != null)
{
metadata.Add(errorMessage ? "ErrorMessage" : "Message", message);
}
if (exception != null)
{
metadata.Add("Exception", exception.ToString());
}
return metadata;
}
private bool DisposeStream()
{
lock (this.deleteOnCloseStreamLock)
{
if (this.deleteOnCloseStream != null)
{
this.deleteOnCloseStream.Dispose();
this.deleteOnCloseStream = null;
return true;
}
}
return false;
}
public class LockException : Exception
{
public LockException(string messageFormat, params string[] args)
: base(string.Format(messageFormat, args))
{
}
}
public class LockFileDoesNotExistException : LockException
{
public LockFileDoesNotExistException(string lockPath)
: base("Lock file {0} does not exist", lockPath)
{
}
}
public class LockSignatureDoesNotMatchException : LockException
{
public LockSignatureDoesNotMatchException(string lockPath, string expectedSignature, string actualSignature)
: base(
"Lock file {0} does not contain expected signature '{1}' (existing signature: '{2}')",
lockPath,
expectedSignature,
actualSignature)
{
}
}
public class LockData
{
public LockData(string signature, string message)
{
this.Signature = signature;
this.Message = message;
}
public string Signature { get; }
public string Message { get; }
}
}
}
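A compact sketch (illustrative, not part of this commit) of the acquire pattern this class supports, assuming fileSystem and tracer instances are already available and using a hypothetical lock path:
using (FileBasedLock fileLock = new FileBasedLock(
    fileSystem,
    tracer,
    lockPath: @"C:\example\.gvfs\prefetch.lock",
    signature: "ExamplePrefetch",
    existingLockCleanup: FileBasedLock.ExistingLockCleanup.DeleteExisting))
{
    if (fileLock.TryAcquireLockAndDeleteOnClose())
    {
        // Do the work that must not run concurrently; the lock file is deleted
        // automatically when the underlying stream is closed on Dispose.
    }
}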

View file

@ -0,0 +1,179 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="14.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<Import Project="$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props" Condition="Exists('$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props')" />
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProjectGuid>{374BF1E5-0B2D-4D4A-BD5E-4212299DEF09}</ProjectGuid>
<OutputType>Library</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>GVFS.Common</RootNamespace>
<AssemblyName>GVFS.Common</AssemblyName>
<TargetFrameworkVersion>v4.5.2</TargetFrameworkVersion>
<FileAlignment>512</FileAlignment>
<NuGetPackageImportStamp>
</NuGetPackageImportStamp>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|x64'">
<DebugSymbols>true</DebugSymbols>
<OutputPath>..\..\..\BuildOutput\GVFS.Common\bin\x64\Debug\</OutputPath>
<IntermediateOutputPath>..\..\..\BuildOutput\GVFS.Common\obj\x64\Debug\</IntermediateOutputPath>
<DefineConstants>DEBUG;TRACE</DefineConstants>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<DebugType>full</DebugType>
<PlatformTarget>x64</PlatformTarget>
<ErrorReport>prompt</ErrorReport>
<CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
<AllowUnsafeBlocks>true</AllowUnsafeBlocks>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Release|x64'">
<OutputPath>..\..\..\BuildOutput\GVFS.Common\bin\x64\Release\</OutputPath>
<IntermediateOutputPath>..\..\..\BuildOutput\GVFS.Common\obj\x64\Release\</IntermediateOutputPath>
<DefineConstants>TRACE</DefineConstants>
<Optimize>true</Optimize>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<DebugType>pdbonly</DebugType>
<PlatformTarget>x64</PlatformTarget>
<ErrorReport>prompt</ErrorReport>
<CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
<AllowUnsafeBlocks>true</AllowUnsafeBlocks>
</PropertyGroup>
<PropertyGroup>
<GVFSVersion>0.2.173.2</GVFSVersion>
</PropertyGroup>
<ItemGroup>
<Reference Include="Esent.Collections">
<HintPath>..\..\..\packages\Microsoft.Database.Collections.Generic.1.9.4\lib\net40\Esent.Collections.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Esent.Interop">
<HintPath>..\..\..\packages\ManagedEsent.1.9.4\lib\net40\Esent.Interop.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Esent.Isam">
<HintPath>..\..\..\packages\Microsoft.Database.Isam.1.9.4\lib\net40\Esent.Isam.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Microsoft.Diagnostics.Tracing.EventSource, Version=1.1.28.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\Microsoft.Diagnostics.Tracing.EventSource.Redist.1.1.28\lib\net40\Microsoft.Diagnostics.Tracing.EventSource.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Newtonsoft.Json, Version=7.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\Newtonsoft.Json.7.0.1\lib\net45\Newtonsoft.Json.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="System" />
<Reference Include="System.Core" />
<Reference Include="System.Management" />
<Reference Include="System.Runtime.Caching" />
<Reference Include="System.Web" />
<Reference Include="System.Xml.Linq" />
<Reference Include="System.Data.DataSetExtensions" />
<Reference Include="Microsoft.CSharp" />
<Reference Include="System.Data" />
<Reference Include="System.Net.Http" />
<Reference Include="System.Xml" />
</ItemGroup>
<ItemGroup>
<Compile Include="..\..\..\BuildOutput\CommonAssemblyVersion.cs">
<Link>CommonAssemblyVersion.cs</Link>
</Compile>
<Compile Include="BatchedLooseObjects\BatchedLooseObjectDeserializer.cs" />
<Compile Include="BatchedLooseObjects\RestrictedStream.cs" />
<Compile Include="AntiVirusExclusions.cs" />
<Compile Include="GitHelper.cs" />
<Compile Include="Git\GitPathConverter.cs" />
<Compile Include="Git\GVFSConfigResponse.cs" />
<Compile Include="GVFSLock.cs" />
<Compile Include="HeartbeatThread.cs" />
<Compile Include="IBackgroundOperation.cs" />
<Compile Include="InvalidRepoException.cs" />
<Compile Include="MountParameters.cs" />
<Compile Include="Physical\Git\CopyBlobContentTimeoutException.cs" />
<Compile Include="Physical\Git\EndianHelper.cs" />
<Compile Include="Physical\RepoMetadata.cs" />
<Compile Include="PrefetchPacks\PrefetchPacksDeserializer.cs" />
<Compile Include="RetryableException.cs" />
<Compile Include="ReturnCode.cs" />
<Compile Include="StreamUtil.cs" />
<Compile Include="CallbackResult.cs" />
<Compile Include="ConcurrentHashSet.cs" />
<Compile Include="FileBasedLock.cs" />
<Compile Include="ProcessPool.cs" />
<Compile Include="Git\DiffTreeResult.cs" />
<Compile Include="Git\GitCatFileBatchCheckProcess.cs" />
<Compile Include="Git\GitCatFileBatchProcess.cs" />
<Compile Include="Git\GitCatFileProcess.cs" />
<Compile Include="Git\GitProcess.cs" />
<Compile Include="Git\GitRefs.cs" />
<Compile Include="Git\GitTreeEntry.cs" />
<Compile Include="Git\GitVersion.cs" />
<Compile Include="NamedPipes\BrokenPipeException.cs" />
<Compile Include="NamedPipes\NamedPipeClient.cs" />
<Compile Include="NamedPipes\NamedPipeMessages.cs" />
<Compile Include="NamedPipes\NamedPipeServer.cs" />
<Compile Include="Physical\RegistryUtils.cs" />
<Compile Include="ProcessHelper.cs" />
<Compile Include="GVFSConstants.cs" />
<Compile Include="GVFSContext.cs" />
<Compile Include="GVFSEnlistment.cs" />
<Compile Include="Git\HttpGitObjects.cs" />
<Compile Include="Enlistment.cs" />
<Compile Include="Git\GitObjects.cs" />
<Compile Include="NativeMethods.cs" />
<Compile Include="Physical\FileSystem\DirectoryItemInfo.cs" />
<Compile Include="Physical\FileSystem\FileProperties.cs" />
<Compile Include="Physical\FileSystem\PhysicalFileSystem.cs" />
<Compile Include="Physical\FileSystem\StreamReaderExtensions.cs" />
<Compile Include="Physical\Git\BigEndianReader.cs" />
<Compile Include="Physical\Git\GitIndex.cs" />
<Compile Include="Physical\Git\GVFSGitObjects.cs" />
<Compile Include="Physical\Git\GitRepo.cs" />
<Compile Include="ProcessResult.cs" />
<Compile Include="Properties\AssemblyInfo.cs" />
<Compile Include="ReliableBackgroundOperations.cs" />
<Compile Include="RetryWrapper.cs" />
<Compile Include="SHA1Util.cs" />
<Compile Include="TaskExtensions.cs" />
<Compile Include="Git\CatFileTimeoutException.cs" />
<Compile Include="Tracing\ConsoleEventListener.cs" />
<Compile Include="Tracing\EventMetadata.cs" />
<Compile Include="Tracing\InProcEventListener.cs" />
<Compile Include="Tracing\ITracer.cs" />
<Compile Include="Tracing\JsonEtwTracer.cs" />
<Compile Include="Tracing\Keywords.cs" />
<Compile Include="Tracing\LogFileEventListener.cs" />
<Compile Include="WindowsProcessJob.cs" />
</ItemGroup>
<ItemGroup>
<None Include="packages.config" />
</ItemGroup>
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
<PropertyGroup>
<ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
</PropertyGroup>
<Error Condition="!Exists('..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets'))" />
<Error Condition="!Exists('..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets'))" />
<Error Condition="!Exists('..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets'))" />
</Target>
<Import Project="..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets" Condition="Exists('..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets')" />
<Import Project="..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets" Condition="Exists('..\..\..\packages\Microsoft.Diagnostics.Tracing.EventRegister.1.1.28\build\Microsoft.Diagnostics.Tracing.EventRegister.targets')" />
<PropertyGroup>
<PostBuildEvent>
</PostBuildEvent>
</PropertyGroup>
<PropertyGroup>
<PreBuildEvent>$(SolutionDir)\Scripts\CreateCommonAssemblyVersion.bat $(GVFSVersion) $(SolutionDir)\..</PreBuildEvent>
</PropertyGroup>
<Import Project="..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets" Condition="Exists('..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets')" />
<!-- To modify your build process, add your task inside one of the targets below and uncomment it.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
-->
</Project>

View file

@ -0,0 +1,143 @@
using GVFS.Common.Git;
using System.IO;
namespace GVFS.Common
{
public static class GVFSConstants
{
public const int ShaStringLength = 40;
public const string RootFolderPath = "\\";
public const char PathSeparator = '\\';
public const string PathSeparatorString = "\\";
public const char GitPathSeparator = '/';
public const string GitPathSeparatorString = "/";
public const string AppName = "GitVirtualFileSystem";
public const string AppGuid = "9a3cf8bb-ef4b-42df-ac4b-f5f50d114909";
public const string DotGVFSPath = ".gvfs";
public const string GVFSLogFolderName = "logs";
public const string VolumeLabel = "Git Virtual File System";
public const string GVFSConfigEndpointSuffix = "/gvfs/config";
public const string InfoRefsEndpointSuffix = "/info/refs?service=git-upload-pack";
public const string VirtualizeObjectsGitConfigName = "core.virtualizeobjects";
public const string CatFileObjectTypeCommit = "commit";
public const string PrefetchPackPrefix = "prefetch";
public const string HeadCommitName = "HEAD";
public const string GVFSHeadCommitName = "GVFS_HEAD";
public const string AllZeroSha = "0000000000000000000000000000000000000000";
public const string GVFSEtwProviderName = "Microsoft.Git.GVFS";
public const string WorkingDirectoryRootName = "src";
public const string GVFSHooksExecutableName = "GVFS.Hooks.exe";
public const string GVFSReadObjectHookExecutableName = "GVFS.ReadObjectHook.exe";
public const int InvalidProcessId = -1;
public const string GitIsNotInstalledError = "Could not find git.exe. Ensure that Git is installed.";
public static readonly GitVersion MinimumGitVersion = new GitVersion(2, 11, 0, "gvfs", 1, 3);
public static class MediaTypes
{
public const string PrefetchPackFilesAndIndexesMediaType = "application/x-gvfs-timestamped-packfiles-indexes";
public const string LooseObjectMediaType = "application/x-git-loose-object";
public const string CustomLooseObjectsMediaType = "application/x-gvfs-loose-objects";
public const string PackFileMediaType = "application/x-git-packfile";
}
public static class DatabaseNames
{
public const string BackgroundGitUpdates = "BackgroundGitUpdates";
public const string BlobSizes = "BlobSizes";
public const string DoNotProject = "DoNotProject";
public const string RepoMetadata = "RepoMetadata";
}
public static class SpecialGitFiles
{
public const string GitAttributes = ".gitattributes";
public const string GitIgnore = ".gitignore";
}
public static class DotGit
{
public const string Root = ".git";
public const string HeadName = "HEAD";
public const string IndexName = "index";
public const string PackedRefsName = "packed-refs";
public const string LockExtension = ".lock";
public static readonly string Config = Path.Combine(DotGit.Root, "config");
public static readonly string Head = Path.Combine(DotGit.Root, HeadName);
public static readonly string Index = Path.Combine(DotGit.Root, IndexName);
public static readonly string PackedRefs = Path.Combine(DotGit.Root, PackedRefsName);
public static readonly string Shallow = Path.Combine(DotGit.Root, "shallow");
public static class Logs
{
public const string Name = "logs";
public static readonly string HeadName = "HEAD";
public static readonly string Root = Path.Combine(DotGit.Root, Name);
public static readonly string Head = Path.Combine(Logs.Root, Logs.HeadName);
}
public static class Hooks
{
public static readonly string ReadObjectName = "read-object";
public static readonly string Root = Path.Combine(DotGit.Root, "hooks");
public static readonly string PreCommandPath = Path.Combine(Hooks.Root, "pre-command");
public static readonly string ReadObjectPath = Path.Combine(Hooks.Root, ReadObjectName);
}
public static class Info
{
public const string ExcludeName = "exclude";
public static readonly string Root = Path.Combine(DotGit.Root, "info");
public static readonly string SparseCheckoutPath = Path.Combine(Info.Root, "sparse-checkout");
public static readonly string ExcludePath = Path.Combine(Info.Root, ExcludeName);
}
public static class Refs
{
public const string Name = "refs";
public static readonly string Root = Path.Combine(DotGit.Root, Refs.Name);
public static class Heads
{
public const string Name = "heads";
public static readonly string Root = Path.Combine(DotGit.Refs.Root, Heads.Name);
}
}
public static class Objects
{
public const string Name = "objects";
public static readonly string Root = Path.Combine(DotGit.Root, Objects.Name);
public static class Pack
{
public const string Name = "pack";
public static readonly string Root = Path.Combine(Objects.Root, Pack.Name);
}
public static class Info
{
public static readonly string Root = Path.Combine(Objects.Root, "info");
}
}
}
}
}

View file

@ -0,0 +1,45 @@
using GVFS.Common.Physical.FileSystem;
using GVFS.Common.Physical.Git;
using GVFS.Common.Tracing;
using System;
namespace GVFS.Common
{
public class GVFSContext : IDisposable
{
private bool disposedValue = false;
public GVFSContext(ITracer tracer, PhysicalFileSystem fileSystem, GitRepo repository, GVFSEnlistment enlistment)
{
this.Tracer = tracer;
this.FileSystem = fileSystem;
this.Enlistment = enlistment;
this.Repository = repository;
}
public ITracer Tracer { get; private set; }
public PhysicalFileSystem FileSystem { get; private set; }
public GitRepo Repository { get; private set; }
public GVFSEnlistment Enlistment { get; private set; }
public void Dispose()
{
this.Dispose(true);
}
protected virtual void Dispose(bool disposing)
{
if (!this.disposedValue)
{
if (disposing)
{
this.Repository.Dispose();
this.Tracer.Dispose();
this.Tracer = null;
}
this.disposedValue = true;
}
}
}
}

View file

@ -0,0 +1,251 @@
using GVFS.Common.Git;
using System;
using System.IO;
using System.Linq;
using System.Threading;
namespace GVFS.Common
{
public class GVFSEnlistment : Enlistment
{
// New enlistment
public GVFSEnlistment(string enlistmentRoot, string repoUrl, string cacheServerUrl, string gitBinPath, string gvfsHooksRoot)
: base(
enlistmentRoot,
Path.Combine(enlistmentRoot, GVFSConstants.WorkingDirectoryRootName),
repoUrl,
cacheServerUrl,
gitBinPath,
gvfsHooksRoot)
{
this.SetComputedPaths();
// Mutex name cannot include '\' (other than the '\' after Global)
// https://msdn.microsoft.com/en-us/library/windows/desktop/ms682411(v=vs.85).aspx
this.EnlistmentMutex = new Mutex(false, "Global\\" + this.NamedPipeName.Replace('\\', ':'));
}
// Existing, configured enlistment
public GVFSEnlistment(string enlistmentRoot, string cacheServerUrl, string gitBinPath, string gvfsHooksRoot)
: base(
enlistmentRoot,
Path.Combine(enlistmentRoot, GVFSConstants.WorkingDirectoryRootName),
cacheServerUrl,
gitBinPath,
gvfsHooksRoot)
{
this.SetComputedPaths();
// Mutex name cannot include '\' (other than the '\' after Global)
// https://msdn.microsoft.com/en-us/library/windows/desktop/ms682411(v=vs.85).aspx
this.EnlistmentMutex = new Mutex(false, GetMutexName(enlistmentRoot));
}
public Mutex EnlistmentMutex { get; }
public string NamedPipeName { get; private set; }
public string DotGVFSRoot { get; private set; }
public string GVFSLogsRoot { get; private set; }
public string GVFSHeadFile { get; private set; }
public static GVFSEnlistment CreateFromCurrentDirectory(string cacheServerUrl, string gitBinRoot)
{
return CreateFromDirectory(Environment.CurrentDirectory, cacheServerUrl, gitBinRoot, null);
}
public static string GetNamedPipeName(string enlistmentRoot)
{
return string.Format("GVFS_{0}", enlistmentRoot).ToUpper().Replace(':', '_');
}
public static string GetMutexName(string enlistmentRoot)
{
string pipeName = GetNamedPipeName(enlistmentRoot);
return "Global\\" + pipeName.Replace('\\', ':');
}
public static string GetEnlistmentRoot(string directory)
{
directory = directory.TrimEnd(GVFSConstants.PathSeparator);
DirectoryInfo dirInfo;
try
{
dirInfo = new DirectoryInfo(directory);
}
catch (Exception)
{
return null;
}
while (dirInfo != null && dirInfo.Exists)
{
DirectoryInfo[] dotGvfsDirs = dirInfo.GetDirectories(GVFSConstants.DotGVFSPath);
if (dotGvfsDirs.Count() == 1)
{
return dirInfo.FullName;
}
dirInfo = dirInfo.Parent;
}
return null;
}
public static GVFSEnlistment CreateFromDirectory(string directory, string cacheServerUrl, string gitBinRoot, string gvfsHooksRoot)
{
string enlistmentRoot = GetEnlistmentRoot(directory);
if (enlistmentRoot != null)
{
return new GVFSEnlistment(enlistmentRoot, cacheServerUrl, gitBinRoot, gvfsHooksRoot);
}
return null;
}
public static string ToFullPath(string originalValue, string toUseIfOriginalNullOrWhitespace)
{
if (string.IsNullOrWhiteSpace(originalValue))
{
return toUseIfOriginalNullOrWhitespace;
}
try
{
return Path.GetFullPath(originalValue);
}
catch (Exception)
{
return null;
}
}
public static string GetNewGVFSLogFileName(string gvfsLogsRoot, string verb = "")
{
if (!Directory.Exists(gvfsLogsRoot))
{
Directory.CreateDirectory(gvfsLogsRoot);
}
string namePrefix = "gvfs_" + (string.IsNullOrEmpty(verb) ? string.Empty : (verb + "_")) + DateTime.Now.ToString("yyyyMMdd_HHmmss");
string fileName = Path.Combine(
gvfsLogsRoot,
namePrefix + ".log");
if (File.Exists(fileName))
{
fileName = Path.Combine(
gvfsLogsRoot,
namePrefix + "_" + Guid.NewGuid().ToString("N") + ".log");
}
return fileName;
}
public bool TrySetCacheServerUrlConfig()
{
GitProcess git = new Git.GitProcess(this);
string settingName = Enlistment.GetCacheConfigSettingName(this.RepoUrl);
return !git.SetInLocalConfig(settingName, this.CacheServerUrl).HasErrors;
}
public bool TryCreateEnlistmentFolders()
{
try
{
Directory.CreateDirectory(this.EnlistmentRoot);
Directory.CreateDirectory(this.WorkingDirectoryRoot);
this.CreateHiddenDirectory(this.DotGVFSRoot);
}
catch (IOException)
{
return false;
}
return true;
}
public string GetMostRecentGVFSLogFileName()
{
DirectoryInfo logDirectory = new DirectoryInfo(this.GVFSLogsRoot);
if (!logDirectory.Exists)
{
return null;
}
FileInfo[] files = logDirectory.GetFiles();
if (files.Length == 0)
{
return null;
}
return
files
.OrderByDescending(fileInfo => fileInfo.CreationTime)
.First()
.FullName;
}
public bool TryParseGVFSHeadFile(out bool fileExists, out string error, out string commitId)
{
fileExists = false;
error = string.Empty;
commitId = string.Empty;
string gvfsHeadFile = this.GVFSHeadFile;
if (File.Exists(gvfsHeadFile))
{
fileExists = true;
try
{
string gvfsHeadContents = File.ReadAllText(gvfsHeadFile);
GitProcess.Result objectTypeResult = new GitProcess(this).CatFileGetType(gvfsHeadContents);
if (objectTypeResult.HasErrors)
{
error = string.Format("Error while determining the type of the commit stored in {0}: {1}", GVFSConstants.GVFSHeadCommitName, objectTypeResult.Errors);
return false;
}
else if (objectTypeResult.Output.StartsWith(GVFSConstants.CatFileObjectTypeCommit))
{
commitId = gvfsHeadContents;
return true;
}
error = string.Format("Contents of {0}: \"{1}\" is not a commit SHA", GVFSConstants.GVFSHeadCommitName, gvfsHeadContents);
return false;
}
catch (Exception e)
{
error = string.Format("Exception while parsing {0}: {1}", gvfsHeadFile, e.ToString());
return false;
}
}
error = string.Format("File \"{0}\" not found", gvfsHeadFile);
return false;
}
private void SetComputedPaths()
{
this.NamedPipeName = GetNamedPipeName(this.EnlistmentRoot);
this.DotGVFSRoot = Path.Combine(this.EnlistmentRoot, GVFSConstants.DotGVFSPath);
this.GVFSLogsRoot = Path.Combine(this.DotGVFSRoot, GVFSConstants.GVFSLogFolderName);
this.GVFSHeadFile = Path.Combine(this.DotGVFSRoot, GVFSConstants.GVFSHeadCommitName);
}
/// <summary>
/// Creates a hidden directory at the given path.
/// If directory already exists, hides it.
/// </summary>
/// <param name="path">Path to desired hidden directory</param>
private void CreateHiddenDirectory(string path)
{
DirectoryInfo dir = Directory.CreateDirectory(path);
dir.Attributes = FileAttributes.Hidden;
}
}
}

View file

@ -0,0 +1,216 @@
using System;
using System.Diagnostics;
using GVFS.Common.NamedPipes;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
namespace GVFS.Common
{
public class GVFSLock
{
private readonly object acquisitionLock = new object();
private readonly ITracer tracer;
private NamedPipeMessages.AcquireLock.Data lockHolder;
private bool isHeldInternally;
public GVFSLock(ITracer tracer)
{
this.tracer = tracer;
}
/// <summary>
/// Allows external callers (non-GVFS) to acquire the lock.
/// </summary>
/// <param name="requester">The data for the external acquisition request.</param>
/// <param name="holder">
/// The current holder of the lock if the acquisition fails, or
/// the input request if it succeeds.
/// </param>
/// <returns>True if the lock was acquired, false otherwise.</returns>
public bool TryAcquireLock(
NamedPipeMessages.AcquireLock.Data requester,
out NamedPipeMessages.AcquireLock.Data holder)
{
EventMetadata metadata = new EventMetadata();
EventLevel eventLevel = EventLevel.Verbose;
metadata.Add("LockRequest", requester.ToString());
try
{
lock (this.acquisitionLock)
{
if (this.isHeldInternally)
{
holder = null;
metadata.Add("CurrentLockHolder", "GVFS");
metadata.Add("Result", "Denied");
return false;
}
if (this.IsExternalProcessAlive() &&
this.lockHolder.PID != requester.PID)
{
holder = this.lockHolder;
metadata.Add("CurrentLockHolder", this.lockHolder.ToString());
metadata.Add("Result", "Denied");
return false;
}
metadata.Add("Result", "Accepted");
eventLevel = EventLevel.Informational;
Process process;
if (ProcessHelper.TryGetProcess(requester.PID, out process) &&
string.Equals(requester.OriginalCommand, ProcessHelper.GetCommandLine(process)))
{
this.lockHolder = requester;
holder = requester;
return true;
}
else
{
// Process is no longer running so let it
// succeed since the process non-existence
// signals the lock release.
holder = null;
return true;
}
}
}
finally
{
this.tracer.RelatedEvent(eventLevel, "TryAcquireLockExternal", metadata);
}
}
/// <summary>
/// Allow GVFS to acquire the lock.
/// </summary>
/// <returns>True if GVFS was able to acquire the lock or if it already held it. False otherwise.</returns>
public bool TryAcquireLock()
{
EventMetadata metadata = new EventMetadata();
EventLevel eventLevel = EventLevel.Verbose;
try
{
lock (this.acquisitionLock)
{
if (this.IsExternalProcessAlive())
{
metadata.Add("CurrentLockHolder", this.lockHolder.ToString());
metadata.Add("Full Command", this.lockHolder.OriginalCommand);
metadata.Add("Result", "Denied");
return false;
}
this.ClearHolder();
this.isHeldInternally = true;
metadata.Add("Result", "Accepted");
eventLevel = EventLevel.Informational;
return true;
}
}
finally
{
this.tracer.RelatedEvent(eventLevel, "TryAcquireLockInternal", metadata);
}
}
/// <summary>
/// Allow GVFS to release the lock if it holds it.
/// </summary>
/// <remarks>
/// This should only be invoked by GVFS and not external callers.
/// Release by external callers is implicit on process termination.
/// </remarks>
public void ReleaseLock()
{
this.tracer.RelatedEvent(EventLevel.Verbose, "ReleaseLock", new EventMetadata());
lock (this.acquisitionLock)
{
this.isHeldInternally = false;
}
}
/// <summary>
/// Returns true if the lock is currently held by an external
/// caller that represents a git call using one of the specified git verbs.
/// </summary>
public bool IsLockedByGitVerb(params string[] verbs)
{
string command = this.GetLockedGitCommand();
if (!string.IsNullOrEmpty(command))
{
return GitHelper.IsVerb(command, verbs);
}
return false;
}
public string GetLockedGitCommand()
{
if (this.IsExternalProcessAlive())
{
return this.lockHolder.ParsedCommand;
}
return null;
}
public string GetStatus()
{
if (this.isHeldInternally)
{
return "Held by GVFS.";
}
string lockedCommand = this.GetLockedGitCommand();
if (!string.IsNullOrEmpty(lockedCommand))
{
return string.Format("Held by {0} (PID:{1})", lockedCommand, this.lockHolder.PID);
}
return "Free";
}
private void ClearHolder()
{
this.lockHolder = null;
}
private bool IsExternalProcessAlive()
{
lock (this.acquisitionLock)
{
if (this.isHeldInternally)
{
if (this.lockHolder != null)
{
throw new InvalidOperationException("Inconsistent GVFSLock state with external holder " + this.lockHolder.ToString());
}
return false;
}
if (this.lockHolder == null)
{
return false;
}
Process process;
if (ProcessHelper.TryGetProcess(this.lockHolder.PID, out process) &&
string.Equals(this.lockHolder.OriginalCommand, ProcessHelper.GetCommandLine(process)))
{
return true;
}
this.ClearHolder();
return false;
}
}
}
}

View file

@ -0,0 +1,8 @@
using System;
namespace GVFS.Common.Git
{
public class CatFileTimeoutException : TimeoutException
{
}
}

View file

@ -0,0 +1,112 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace GVFS.Common.Git
{
public class DiffTreeResult
{
private static readonly HashSet<string> ValidTreeModes = new HashSet<string>() { "040000" };
public enum Operations
{
Unknown,
CopyEdit,
RenameEdit,
Modify,
Delete,
Add,
Unmerged
}
public Operations Operation { get; set; }
public bool SourceIsDirectory { get; set; }
public bool TargetIsDirectory { get; set; }
public string SourceFilename { get; set; }
public string TargetFilename { get; set; }
public string SourceSha { get; set; }
public string TargetSha { get; set; }
public static DiffTreeResult ParseFromDiffTreeLine(string line, string repoRoot)
{
line = line.Substring(1);
// Filenames may contain spaces, but always follow a \t. Other fields are space delimited.
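// e.g. a raw modify line looks like (hypothetical shas and path):
// ":100644 100644 <40-hex source sha> <40-hex target sha> M\tfolder/file name.txt"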
string[] parts = line.Split('\t');
parts = parts[0].Split(' ').Concat(parts.Skip(1)).ToArray();
DiffTreeResult result = new DiffTreeResult();
result.SourceIsDirectory = ValidTreeModes.Contains(parts[0]);
result.TargetIsDirectory = ValidTreeModes.Contains(parts[1]);
result.SourceSha = parts[2];
result.TargetSha = parts[3];
result.Operation = DiffTreeResult.ParseOperation(parts[4]);
result.TargetFilename = ConvertPathToAbsoluteUtf8Path(repoRoot, parts.Last());
result.SourceFilename = parts.Length == 7 ? ConvertPathToAbsoluteUtf8Path(repoRoot, parts[5]) : null;
return result;
}
public static DiffTreeResult ParseFromLsTreeLine(string line, string repoRoot)
{
// Everything from ls-tree is an add.
int treeIndex = line.IndexOf(GitCatFileProcess.TreeMarker);
if (treeIndex >= 0)
{
DiffTreeResult treeAdd = new DiffTreeResult();
treeAdd.TargetIsDirectory = true;
treeAdd.TargetFilename = ConvertPathToAbsoluteUtf8Path(repoRoot, line.Substring(line.LastIndexOf("\t") + 1));
treeAdd.Operation = DiffTreeResult.Operations.Add;
return treeAdd;
}
else
{
int blobIndex = line.IndexOf(GitCatFileProcess.BlobMarker);
if (blobIndex >= 0)
{
DiffTreeResult blobAdd = new DiffTreeResult();
blobAdd.TargetSha = line.Substring(blobIndex + GitCatFileProcess.BlobMarker.Length, GVFSConstants.ShaStringLength);
blobAdd.TargetFilename = ConvertPathToAbsoluteUtf8Path(repoRoot, line.Substring(line.LastIndexOf("\t") + 1));
blobAdd.Operation = DiffTreeResult.Operations.Add;
return blobAdd;
}
else
{
return null;
}
}
}
private static Operations ParseOperation(string gitOperationString)
{
switch (gitOperationString)
{
case "U": return Operations.Unmerged;
case "M": return Operations.Modify;
case "A": return Operations.Add;
case "D": return Operations.Delete;
case "X": return Operations.Unknown;
default:
if (gitOperationString.StartsWith("C"))
{
return Operations.CopyEdit;
}
else if (gitOperationString.StartsWith("R"))
{
return Operations.RenameEdit;
}
throw new InvalidDataException("Unrecognized diff-tree operation: " + gitOperationString);
}
}
private static string ConvertPathToAbsoluteUtf8Path(string repoRoot, string relativePath)
{
return Path.Combine(repoRoot, GitPathConverter.ConvertPathOctetsToUtf8(relativePath.Trim('"')).Replace('/', '\\'));
}
}
}
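
As a worked illustration of the parsing above (all values invented): a raw "git diff-tree -r -t" line starts with ':', carries space-delimited modes, shas, and a status letter, and puts the path after a tab.

// Hypothetical diff-tree line for a modified blob (shas are placeholders):
string sampleLine =
    ":100644 100644 " + new string('a', 40) + " " + new string('b', 40) +
    " M\tFolder/File.txt";

DiffTreeResult result = DiffTreeResult.ParseFromDiffTreeLine(sampleLine, @"C:\Repo\src");
// result.Operation      == DiffTreeResult.Operations.Modify
// result.TargetFilename == @"C:\Repo\src\Folder\File.txt"
// result.SourceIsDirectory and result.TargetIsDirectory are false (100644 is a blob mode)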

Просмотреть файл

@ -0,0 +1,16 @@
using System;
using System.Collections.Generic;
namespace GVFS.Common.Git
{
public class GVFSConfigResponse
{
public IEnumerable<VersionRange> AllowedGvfsClientVersions { get; set; }
public class VersionRange
{
public Version Min { get; set; }
public Version Max { get; set; }
}
}
}
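
A hedged sketch of how a client could consume this response; the helper below is illustrative only (not the shipped version check), treats a null bound as open, and assumes the usual System and System.Linq usings.

// Hypothetical helper: is 'current' inside any advertised [Min, Max] range?
private static bool IsClientVersionAllowed(GVFSConfigResponse config, Version current)
{
    return config.AllowedGvfsClientVersions != null &&
           config.AllowedGvfsClientVersions.Any(range =>
               (range.Min == null || current >= range.Min) &&
               (range.Max == null || current <= range.Max));
}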

Просмотреть файл

@ -0,0 +1,29 @@
using System.IO;
namespace GVFS.Common.Git
{
public class GitCatFileBatchCheckProcess : GitCatFileProcess
{
public GitCatFileBatchCheckProcess(Enlistment enlistment) : base(enlistment, "--batch-check")
{
}
public GitCatFileBatchCheckProcess(StreamReader stdOut, StreamWriter stdIn) : base(stdOut, stdIn)
{
}
public bool TryGetObjectSize(string objectSha, out long size)
{
this.StdIn.Write(objectSha + "\n");
string header;
return this.TryParseSizeFromStdOut(out header, out size);
}
public bool ObjectExists(string objectSha)
{
this.StdIn.Write(objectSha + "\n");
string header = this.StdOut.ReadLine();
return header != null && !header.EndsWith("missing");
}
}
}
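
For context, a small usage sketch of the "--batch-check" exchange this wrapper drives: one sha plus a newline is written to stdin, and git answers with a single header line of the form "<sha> <type> <size>" (or "<sha> missing"). The enlistment variable and the sha below are assumptions made for illustration.

// Hypothetical usage (sha invented):
using (GitCatFileBatchCheckProcess batchCheck = new GitCatFileBatchCheckProcess(enlistment))
{
    long size;
    if (batchCheck.TryGetObjectSize("0123456789abcdef0123456789abcdef01234567", out size))
    {
        // git replied "<sha> blob <size>", so 'size' now holds the object's length in bytes.
    }
}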

Просмотреть файл

@ -0,0 +1,310 @@
using GVFS.Common.Physical.FileSystem;
using GVFS.Common.Physical.Git;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace GVFS.Common.Git
{
public class GitCatFileBatchProcess : GitCatFileProcess
{
private const int BufferSize = 64 * 1024;
private static readonly HashSet<string> ValidBlobModes = new HashSet<string>() { "100644", "100755", "120000" };
public GitCatFileBatchProcess(Enlistment enlistment) : base(enlistment, "--batch")
{
}
public GitCatFileBatchProcess(StreamReader stdOut, StreamWriter stdIn) : base(stdOut, stdIn)
{
}
public IEnumerable<GitTreeEntry> GetTreeEntries(string commitId, string path)
{
IEnumerable<string> foundShas;
if (this.TryGetShasForPath(commitId, path, isFolder: true, shas: out foundShas))
{
HashSet<string> alreadyAdded = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
List<GitTreeEntry> results = new List<GitTreeEntry>();
foreach (string sha in foundShas)
{
foreach (GitTreeEntry entry in this.GetTreeEntries(sha))
{
if (alreadyAdded.Add(entry.Name))
{
results.Add(entry);
}
}
}
return results;
}
return new GitTreeEntry[0];
}
public IEnumerable<GitTreeEntry> GetTreeEntries(string sha)
{
string header;
char[] rawTreeChars;
this.StdIn.Write(sha + "\n");
if (!this.TryReadCatFileBatchOutput(out header, out rawTreeChars)
|| !header.Contains(GitCatFileProcess.TreeMarker))
{
return new GitTreeEntry[0];
}
return this.ParseTree(new string(rawTreeChars));
}
public string GetTreeSha(string commitish)
{
string header;
char[] rawTreeChars;
this.StdIn.Write(commitish + "\n");
if (!this.TryReadCatFileBatchOutput(out header, out rawTreeChars)
|| !header.Contains(GitCatFileProcess.CommitMarker))
{
return null;
}
const string TreeLinePrefix = "tree ";
string commitDetails = new string(rawTreeChars);
string[] detailLines = commitDetails.Split('\n');
if (detailLines.Length < 1 || !detailLines[0].StartsWith(TreeLinePrefix))
{
throw new InvalidDataException("'tree' expected on first line of 'git cat-file'. Actual: " + (detailLines.Length == 0 ? "empty" : detailLines[0]));
}
return detailLines[0].Substring(TreeLinePrefix.Length);
}
public string GetCommitId(string commitish)
{
string header;
char[] rawTreeChars;
this.StdIn.Write(commitish + "\n");
if (!this.TryReadCatFileBatchOutput(out header, out rawTreeChars))
{
return null;
}
int commitMarkerIndex = header.IndexOf(GitCatFileProcess.CommitMarker);
if (commitMarkerIndex < 0)
{
return null;
}
return header.Substring(0, commitMarkerIndex);
}
public bool TryGetFileSha(string commitId, string virtualPath, out string sha)
{
sha = null;
IEnumerable<string> foundShas;
if (this.TryGetShasForPath(commitId, virtualPath, isFolder: false, shas: out foundShas))
{
if (foundShas.Count() > 1)
{
return false;
}
sha = foundShas.Single();
return true;
}
return false;
}
public bool TryCopyBlobContentStream(string blobSha, Action<StreamReader, long> writeAction)
{
string header;
long blobSize;
this.StdIn.Write(blobSha + "\n");
header = this.StdOut.ReadLineAsync().Timeout<string, CopyBlobContentTimeoutException>(GitCatFileProcess.ProcessReadTimeoutMs);
if (!this.TryParseSizeFromCatFileHeader(header, out blobSize))
{
return false;
}
if (!header.Contains(GitCatFileProcess.BlobMarker))
{
// Even if not a blob, be sure to read the remaining bytes (+ 1 for \n) to leave the process in a good state
this.StdOut.CopyBlockTo<CopyBlobContentTimeoutException>(StreamWriter.Null, blobSize + 1);
return false;
}
writeAction(this.StdOut, blobSize);
this.StdOut.CopyBlockTo<CopyBlobContentTimeoutException>(StreamWriter.Null, 1);
return true;
}
public async Task CopyBlobContentStreamAsync(string blobSha, Stream destination)
{
string header;
long blobSize;
await this.StdIn.WriteAsync(blobSha + "\n");
header = await this.StdOut.ReadLineAsync();
if (!this.TryParseSizeFromCatFileHeader(header, out blobSize))
{
throw new InvalidDataException("Invalid cat-file response.");
}
if (!header.Contains(GitCatFileProcess.BlobMarker))
{
// Even if not a blob, be sure to read the remaining bytes (+ 1 for \n) to leave the process in a good state
await this.StdOut.CopyBlockToAsync(StreamWriter.Null, blobSize + 1);
return;
}
using (StreamWriter writer = new StreamWriter(destination, this.StdOut.CurrentEncoding, BufferSize, true))
{
await this.StdOut.CopyBlockToAsync(writer, blobSize);
}
await this.StdOut.CopyBlockToAsync(StreamWriter.Null, 1);
}
private bool TryReadCatFileBatchOutput(out string header, out char[] str)
{
long remainingSize;
header = this.StdOut.ReadLineAsync().Timeout<string, CatFileTimeoutException>(GitCatFileProcess.ProcessReadTimeoutMs);
if (!this.TryParseSizeFromCatFileHeader(header, out remainingSize))
{
str = null;
return false;
}
str = new char[remainingSize + 1]; // Grab trailing \n
this.StdOut.ReadBlockAsync(str, 0, str.Length).Timeout<CatFileTimeoutException>(GitCatFileProcess.ProcessReadTimeoutMs);
return true;
}
private bool TryParseSizeFromCatFileHeader(string header, out long remainingSize)
{
if (header == null || header.EndsWith("missing"))
{
remainingSize = 0;
return false;
}
int spaceIdx = header.LastIndexOf(' ');
if (spaceIdx < 0)
{
throw new InvalidDataException("git cat-file has invalid header " + header);
}
string sizeString = header.Substring(spaceIdx);
if (!long.TryParse(sizeString, out remainingSize) || remainingSize < 0)
{
remainingSize = 0;
return false;
}
return true;
}
private IEnumerable<GitTreeEntry> ParseTree(string rawTreeData)
{
int i = 0;
int len = rawTreeData.Length - 1; // Ignore the trailing \n
while (i < len)
{
int endOfObjMode = rawTreeData.IndexOf(' ', i);
if (endOfObjMode < 0)
{
throw new InvalidDataException("git cat-file content has invalid mode");
}
string objectMode = rawTreeData.Substring(i, endOfObjMode - i);
bool isBlob = ValidBlobModes.Contains(objectMode);
i = endOfObjMode + 1; // +1 to skip space
int endOfObjName = rawTreeData.IndexOf('\0', i);
if (endOfObjName < 0)
{
throw new InvalidDataException("git cat-file content has invalid name");
}
string fileName = Encoding.UTF8.GetString(this.StdOut.CurrentEncoding.GetBytes(rawTreeData.Substring(i, endOfObjName - i)));
i = endOfObjName + 1; // +1 to skip null
byte[] shaBytes = this.StdOut.CurrentEncoding.GetBytes(rawTreeData.Substring(i, 20));
string sha = BitConverter.ToString(shaBytes).Replace("-", string.Empty);
if (sha.Length != GVFSConstants.ShaStringLength)
{
throw new InvalidDataException("git cat-file content has invalid sha: " + sha);
}
i += 20;
yield return new GitTreeEntry(fileName, sha, !isBlob, isBlob);
}
}
/// <summary>
/// We are trying to get the sha of a single path. However, if that is the path of a folder, it can
/// potentially correspond to multiple git trees, and therefore we have to return multiple shas.
///
/// This is because git and Windows disagree on case sensitivity. If you add the folders
/// foo and Foo, git will store those as two different trees, but Windows will only ever create a single
/// folder that contains the union of the files inside both trees. In order to enumerate Foo correctly,
/// we have to treat both trees as if they are the same.
///
/// This has one major problem, but Git for Windows has the same issue even with no GVFS in the picture.
/// If you have the files foo\A.txt and Foo\A.txt, after you checkout, git writes both of those files,
/// but whichever one gets written second overwrites the one that was written first, and git status
/// will always report one of them as deleted. In GVFS, we do a case-insensitive union of foo and Foo,
/// so we will end up with the same end result.
/// </summary>
private bool TryGetShasForPath(string commitId, string virtualPath, bool isFolder, out IEnumerable<string> shas)
{
shas = Enumerable.Empty<string>();
string rootTreeSha = this.GetTreeSha(commitId);
if (rootTreeSha == null)
{
return false;
}
List<string> currentLevelShas = new List<string>();
currentLevelShas.Add(rootTreeSha);
string[] pathParts = virtualPath.Split(new char[] { GVFSConstants.PathSeparator }, StringSplitOptions.RemoveEmptyEntries);
for (int i = 0; i < pathParts.Length; ++i)
{
List<string> nextLevelShas = new List<string>();
bool isTree = isFolder || i < pathParts.Length - 1;
foreach (string treeSha in currentLevelShas)
{
IEnumerable<GitTreeEntry> childrenMatchingName =
this.GetTreeEntries(treeSha)
.Where(entry =>
entry.IsTree == isTree &&
string.Equals(pathParts[i], entry.Name, StringComparison.OrdinalIgnoreCase));
foreach (GitTreeEntry childEntry in childrenMatchingName)
{
nextLevelShas.Add(childEntry.Sha);
}
}
if (nextLevelShas.Count == 0)
{
return false;
}
currentLevelShas = nextLevelShas;
}
shas = currentLevelShas;
return true;
}
}
}
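
To make the case-insensitivity union described in TryGetShasForPath concrete, here is an invented example (shas and file names are placeholders, and it assumes a GitCatFileBatchProcess instance named batchProcess and a commitId string in scope).

// Hypothetical illustration: a commit contains both 'foo/A.txt' and 'Foo/B.txt',
// so git stores two distinct trees for what Windows sees as one folder:
//   tree "foo" -> aaaa... (contains A.txt)
//   tree "Foo" -> bbbb... (contains B.txt)
// Asking for the folder "foo" (in any casing) finds BOTH tree shas, so the result
// is the case-insensitive union { A.txt, B.txt }, de-duplicated by name.
IEnumerable<GitTreeEntry> union = batchProcess.GetTreeEntries(commitId, "foo");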

Просмотреть файл

@ -0,0 +1,98 @@
using System;
using System.Diagnostics;
using System.IO;
namespace GVFS.Common.Git
{
public abstract class GitCatFileProcess : IDisposable
{
public const string TreeMarker = " tree ";
public const string BlobMarker = " blob ";
public const string CommitMarker = " commit ";
protected const int ProcessReadTimeoutMs = 30000;
protected const int ProcessShutdownTimeoutMs = 2000;
protected readonly StreamReader StdOut;
protected readonly StreamWriter StdIn;
private Process catFileProcess;
private WindowsProcessJob job;
public GitCatFileProcess(Enlistment enlistment, string catFileArgs)
{
// Errors only happen if we use incorrect 'cat-file --batch' parameters. Functional tests will catch these cases.
// This git.exe should not need/use the working directory of the repo.
// Run git.exe in Environment.SystemDirectory to ensure the git.exe process
// does not touch the working directory
// TODO: 851558 Capture and log standard error output
this.catFileProcess = new GitProcess(enlistment).GetGitProcess(
"cat-file " + catFileArgs,
workingDirectory: Environment.SystemDirectory,
dotGitDirectory: enlistment.DotGitRoot,
useReadObjectHook: false,
redirectStandardError: false);
this.catFileProcess.Start();
// We have to use a job to ensure that we can kill the process correctly. The git.exe process that we launch
// immediately creates a child git.exe process, and if we just kill the process we created, the other one gets orphaned.
// By adding the launched git.exe process to a job and closing the job, we guarantee that both processes will exit.
this.job = new WindowsProcessJob(this.catFileProcess);
this.StdIn = this.catFileProcess.StandardInput;
this.StdOut = this.catFileProcess.StandardOutput;
}
public GitCatFileProcess(StreamReader stdOut, StreamWriter stdIn)
{
this.StdIn = stdIn;
this.StdOut = stdOut;
}
public bool IsRunning()
{
return !this.catFileProcess.HasExited;
}
public void Dispose()
{
this.Kill();
}
public void Kill()
{
if (this.job != null)
{
this.job.Dispose();
this.job = null;
}
if (this.catFileProcess != null)
{
this.catFileProcess.Dispose();
this.catFileProcess = null;
}
}
protected bool TryParseSizeFromStdOut(out string header, out long size)
{
// Git always outputs at least one \n-terminated line, so we cannot hang here
header = this.StdOut.ReadLineAsync().Timeout<string, CatFileTimeoutException>(ProcessReadTimeoutMs);
if (header == null || header.EndsWith("missing"))
{
size = 0;
return false;
}
string sizeString = header.Substring(header.LastIndexOf(' '));
if (!long.TryParse(sizeString, out size) || size < 0)
{
throw new InvalidDataException("git cat-file header has invalid size: " + sizeString);
}
return true;
}
}
}
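
A standalone sketch of the header parse performed by TryParseSizeFromStdOut above: the size is whatever follows the last space in the cat-file header line. The header value is invented.

// Hypothetical header as emitted by 'git cat-file --batch-check':
string header = "0123456789abcdef0123456789abcdef01234567 blob 1457";

// Everything after the last space is the object size (long.Parse tolerates the leading space).
long size = long.Parse(header.Substring(header.LastIndexOf(' ')));  // 1457

// A "<sha> missing" reply is the not-found case and carries no size.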

Просмотреть файл

@ -0,0 +1,527 @@
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
namespace GVFS.Common.Git
{
public class GitObjects
{
protected readonly ITracer Tracer;
protected readonly Enlistment Enlistment;
protected readonly HttpGitObjects GitObjectRequestor;
private const string AreaPath = "GitObjects";
public GitObjects(ITracer tracer, Enlistment enlistment, HttpGitObjects httpGitObjects)
{
this.Tracer = tracer;
this.Enlistment = enlistment;
this.GitObjectRequestor = httpGitObjects;
}
public enum DownloadAndSaveObjectResult
{
Success,
ObjectNotOnServer,
Error
}
public virtual bool TryDownloadAndSaveCommits(IEnumerable<string> commitShas, int commitDepth)
{
return this.TryDownloadAndSaveObjects(commitShas, commitDepth, preferLooseObjects: false);
}
public bool TryDownloadAndSaveBlobs(IEnumerable<string> blobShas)
{
return this.TryDownloadAndSaveObjects(blobShas, commitDepth: 1, preferLooseObjects: true);
}
public void DownloadPrefetchPacks(long latestTimestamp)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("latestTimestamp", latestTimestamp);
using (ITracer activity = this.Tracer.StartActivity(nameof(this.DownloadPrefetchPacks), EventLevel.Informational, metadata))
{
RetryWrapper<HttpGitObjects.GitObjectTaskResult>.InvocationResult result = this.GitObjectRequestor.TrySendProtocolRequest(
onSuccess: (tryCount, response) => this.DeserializePrefetchPacks(response, ref latestTimestamp),
onFailure: RetryWrapper<HttpGitObjects.GitObjectTaskResult>.StandardErrorHandler(activity, nameof(this.DownloadPrefetchPacks)),
method: HttpMethod.Get,
endPointGenerator: () => new Uri(
string.Format(
"{0}?lastPackTimestamp={1}",
this.Enlistment.PrefetchEndpointUrl,
latestTimestamp)),
requestBodyGenerator: () => null,
acceptType: new MediaTypeWithQualityHeaderValue(GVFSConstants.MediaTypes.PrefetchPackFilesAndIndexesMediaType));
if (!result.Succeeded)
{
if (result.Result != null && result.Result.HttpStatusCodeResult == HttpStatusCode.NotFound)
{
EventMetadata warning = new EventMetadata();
warning.Add("ErrorMessage", "The server does not support /gvfs/prefetch.");
warning.Add(nameof(this.Enlistment.PrefetchEndpointUrl), this.Enlistment.PrefetchEndpointUrl);
activity.RelatedEvent(EventLevel.Warning, "CommandNotSupported", warning);
}
else
{
EventMetadata error = new EventMetadata();
error.Add("latestTimestamp", latestTimestamp);
error.Add("Exception", result.Error);
error.Add("ErrorMessage", "DownloadPrefetchPacks failed.");
error.Add(nameof(this.Enlistment.PrefetchEndpointUrl), this.Enlistment.PrefetchEndpointUrl);
activity.RelatedError(error);
}
}
}
}
public virtual string WriteLooseObject(string repoRoot, Stream responseStream, string sha, byte[] bufToCopyWith = null)
{
LooseObjectToWrite toWrite = GetLooseObjectDestination(repoRoot, sha);
using (Stream fileStream = OpenTempLooseObjectStream(toWrite.TempFile, async: false))
{
if (bufToCopyWith != null)
{
StreamUtil.CopyToWithBuffer(responseStream, fileStream, bufToCopyWith);
}
else
{
responseStream.CopyTo(fileStream);
}
}
this.FinalizeTempFile(sha, toWrite);
return toWrite.ActualFile;
}
public virtual string WriteTempPackFile(HttpGitObjects.GitEndPointResponseData response)
{
string fileName = Path.GetRandomFileName();
string fullPath = Path.Combine(this.Enlistment.GitPackRoot, fileName);
this.TryWriteNamedPackOrIdx(
tracer: null,
source: response.Stream,
targetFullPath: fullPath,
throwOnError: true);
return fullPath;
}
public virtual bool TryWriteNamedPackOrIdx(
ITracer tracer,
Stream source,
string targetFullPath,
bool throwOnError = false)
{
// It is important to write to a temp file and then rename it, so that git
// never observes a half-written pack or index file and rejects it as corrupt.
string tempPath = targetFullPath + "temp";
try
{
using (Stream fileStream = File.OpenWrite(tempPath))
{
source.CopyTo(fileStream);
}
this.ValidateTempFile(tempPath, targetFullPath);
File.Move(tempPath, targetFullPath);
}
catch (Exception ex)
{
this.CleanupTempFile(this.Tracer, tempPath);
if (tracer != null)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Exception", ex.ToString());
metadata.Add("ErrorMessage", "Exception caught while writing pack or index");
metadata.Add("TargetFullPath", targetFullPath);
tracer.RelatedError(metadata);
}
if (throwOnError)
{
throw;
}
else
{
return false;
}
}
return true;
}
public virtual GitProcess.Result IndexTempPackFile(string tempPackPath)
{
string packfilePath = GetRandomPackName(this.Enlistment.GitPackRoot);
return this.IndexTempPackFile(tempPackPath, packfilePath);
}
public virtual GitProcess.Result IndexTempPackFile(string tempPackPath, string packfilePath)
{
try
{
File.Move(tempPackPath, packfilePath);
GitProcess.Result result = new GitProcess(this.Enlistment).IndexPack(packfilePath);
if (result.HasErrors)
{
File.Delete(packfilePath);
}
return result;
}
catch (Exception e)
{
if (File.Exists(packfilePath))
{
File.Delete(packfilePath);
}
if (File.Exists(tempPackPath))
{
File.Delete(tempPackPath);
}
return new GitProcess.Result(string.Empty, e.Message, GitProcess.Result.GenericFailureCode);
}
}
public virtual string[] ReadPackFileNames(string prefixFilter = "")
{
return Directory.GetFiles(this.Enlistment.GitPackRoot, prefixFilter + "*.pack");
}
protected virtual DownloadAndSaveObjectResult TryDownloadAndSaveObject(string objectSha)
{
if (objectSha == GVFSConstants.AllZeroSha)
{
return DownloadAndSaveObjectResult.Error;
}
RetryWrapper<HttpGitObjects.GitObjectTaskResult>.InvocationResult output = this.GitObjectRequestor.TryDownloadLooseObject(
objectSha,
onSuccess: (tryCount, response) =>
{
this.WriteLooseObject(this.Enlistment.WorkingDirectoryRoot, response.Stream, objectSha);
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(new HttpGitObjects.GitObjectTaskResult(true));
},
onFailure: this.HandleDownloadAndSaveObjectError);
if (output.Succeeded && output.Result.Success)
{
return DownloadAndSaveObjectResult.Success;
}
if (output.Result != null && output.Result.HttpStatusCodeResult == HttpStatusCode.NotFound)
{
return DownloadAndSaveObjectResult.ObjectNotOnServer;
}
return DownloadAndSaveObjectResult.Error;
}
private static string GetRandomPackName(string packRoot)
{
string packName = "pack-" + Guid.NewGuid().ToString("N") + ".pack";
return Path.Combine(packRoot, packName);
}
private static LooseObjectToWrite GetLooseObjectDestination(string repoRoot, string sha)
{
string firstTwoDigits = sha.Substring(0, 2);
string remainingDigits = sha.Substring(2);
string twoLetterFolderName = Path.Combine(repoRoot, GVFSConstants.DotGit.Objects.Root, firstTwoDigits);
Directory.CreateDirectory(twoLetterFolderName);
return new LooseObjectToWrite(
tempFile: Path.Combine(twoLetterFolderName, Path.GetRandomFileName()),
actualFile: Path.Combine(twoLetterFolderName, remainingDigits));
}
private static FileStream OpenTempLooseObjectStream(string path, bool async)
{
FileOptions options = FileOptions.SequentialScan;
if (async)
{
options |= FileOptions.Asynchronous;
}
return new FileStream(
path,
FileMode.Create,
FileAccess.Write,
FileShare.None,
bufferSize: 4096, // .NET Default
options: options);
}
private bool TryDownloadAndSaveObjects(IEnumerable<string> objectIds, int commitDepth, bool preferLooseObjects)
{
RetryWrapper<HttpGitObjects.GitObjectTaskResult>.InvocationResult output = this.GitObjectRequestor.TryDownloadObjects(
objectIds,
commitDepth,
onSuccess: (tryCount, response) => this.TrySavePackOrLooseObject(objectIds, preferLooseObjects, response),
onFailure: (eArgs) =>
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Operation", "DownloadAndSaveObjects");
metadata.Add("WillRetry", eArgs.WillRetry);
metadata.Add("ErrorMessage", eArgs.Error.ToString());
this.Tracer.RelatedError(metadata, Keywords.Network);
},
preferBatchedLooseObjects: preferLooseObjects);
return output.Succeeded && output.Result.Success;
}
private void HandleDownloadAndSaveObjectError(RetryWrapper<HttpGitObjects.GitObjectTaskResult>.ErrorEventArgs errorArgs)
{
// Don't log 404s for object downloads. They far more often mean git is probing whether
// a new object already exists on the server than that an object is truly missing.
HttpGitObjects.HttpGitObjectsException ex = errorArgs.Error as HttpGitObjects.HttpGitObjectsException;
if (ex != null && ex.StatusCode == HttpStatusCode.NotFound)
{
return;
}
RetryWrapper<HttpGitObjects.GitObjectTaskResult>.StandardErrorHandler(this.Tracer, nameof(this.TryDownloadAndSaveObject))(errorArgs);
}
/// <summary>
/// Uses a <see cref="PrefetchPacksDeserializer"/> to read the packs from the stream.
/// </summary>
private RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult DeserializePrefetchPacks(
HttpGitObjects.GitEndPointResponseData response, ref long latestTimestamp)
{
using (ITracer activity = this.Tracer.StartActivity(nameof(this.DeserializePrefetchPacks), EventLevel.Informational))
{
PrefetchPacksDeserializer deserializer = new PrefetchPacksDeserializer(response.Stream);
foreach (PrefetchPacksDeserializer.PackAndIndex pack in deserializer.EnumeratePacks())
{
string packName = string.Format("{0}-{1}-{2}.pack", GVFSConstants.PrefetchPackPrefix, pack.Timestamp, pack.UniqueId);
string packFullPath = Path.Combine(this.Enlistment.GitPackRoot, packName);
string idxName = string.Format("{0}-{1}-{2}.idx", GVFSConstants.PrefetchPackPrefix, pack.Timestamp, pack.UniqueId);
string idxFullPath = Path.Combine(this.Enlistment.GitPackRoot, idxName);
EventMetadata data = new EventMetadata();
data["timestamp"] = pack.Timestamp.ToString();
data["uniqueId"] = pack.UniqueId;
activity.RelatedEvent(EventLevel.Informational, "Receiving Pack/Index", data);
// Write the pack
// If it fails, TryWriteNamedPackOrIdx cleans up the packfile and we retry the prefetch
if (!this.TryWriteNamedPackOrIdx(activity, pack.PackStream, packFullPath))
{
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(null, true);
}
// We will try to build an index if the server does not send one
if (pack.IndexStream == null)
{
if (!this.TryBuildIndex(activity, pack, packFullPath))
{
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(null, true);
}
}
else if (!this.TryWriteNamedPackOrIdx(activity, pack.IndexStream, idxFullPath))
{
// Try to build the index manually, then retry the prefetch
if (this.TryBuildIndex(activity, pack, packFullPath))
{
// If we were able to recreate the failed index
// we can start the prefetch at the next timestamp
latestTimestamp = pack.Timestamp;
}
// The download stream will not be in a good state if the index download fails.
// So we have to restart the prefetch
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(null, true);
}
latestTimestamp = pack.Timestamp;
}
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(
new HttpGitObjects.GitObjectTaskResult(true));
}
}
private bool TryBuildIndex(
ITracer activity,
PrefetchPacksDeserializer.PackAndIndex pack,
string packFullPath)
{
GitProcess.Result result = this.IndexTempPackFile(packFullPath, Path.ChangeExtension(packFullPath, ".pack"));
if (result.HasErrors)
{
// IndexTempPackFile will delete the bad temp pack for us.
EventMetadata errorMetadata = new EventMetadata();
errorMetadata.Add("Operation", "TryBuildIndex");
errorMetadata.Add("pack", packFullPath);
errorMetadata.Add("ErrorMessage", result.Errors);
activity.RelatedError(errorMetadata);
}
return !result.HasErrors;
}
private void CleanupTempFile(ITracer activity, string packRoot, string file)
{
if (file == null)
{
return;
}
string fullPath = Path.Combine(packRoot, file);
this.CleanupTempFile(activity, fullPath);
}
private void CleanupTempFile(ITracer activity, string fullPath)
{
try
{
if (File.Exists(fullPath))
{
File.Delete(fullPath);
}
}
catch (IOException failedDelete)
{
EventMetadata info = new EventMetadata();
info.Add("ErrorMessage", "Exception cleaning up temp file");
info.Add("file", fullPath);
info.Add("Exception", failedDelete.ToString());
activity.RelatedEvent(EventLevel.Warning, "Warning", info);
}
}
private void FinalizeTempFile(string sha, LooseObjectToWrite toWrite)
{
try
{
// Checking for existence first reduces warning output when a streamed download tries to write an object that already exists.
if (!File.Exists(toWrite.ActualFile))
{
this.ValidateTempFile(toWrite.TempFile, sha);
File.Move(toWrite.TempFile, toWrite.ActualFile);
}
}
catch (IOException)
{
// IOExceptions happen when someone else is writing to our object.
// That implies they are doing what we're doing, which should be a success
}
finally
{
this.CleanupTempFile(this.Tracer, toWrite.TempFile);
}
}
private void ValidateTempFile(string filePath, string intendedPurpose)
{
FileInfo info = new FileInfo(filePath);
if (info.Length == 0)
{
throw new RetryableException("Temp file for '" + intendedPurpose + "' was written with 0 bytes");
}
else
{
using (Stream fs = info.OpenRead())
{
byte[] buffer = new byte[10];
int bytesRead = fs.Read(buffer, 0, buffer.Length);
if (buffer.Take(bytesRead).All(b => b == 0))
{
throw new RetryableException("Temp file for '" + intendedPurpose + "' was written with " + buffer.Length + " null bytes");
}
}
}
}
private RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult TrySavePackOrLooseObject(IEnumerable<string> objectShas, bool unpackObjects, HttpGitObjects.GitEndPointResponseData responseData)
{
if (responseData.ContentType == HttpGitObjects.ContentType.LooseObject)
{
List<string> objectShaList = objectShas.Distinct().ToList();
if (objectShaList.Count != 1)
{
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(new InvalidOperationException("Received loose object when multiple objects were requested."), shouldRetry: false);
}
this.WriteLooseObject(this.Enlistment.WorkingDirectoryRoot, responseData.Stream, objectShaList[0]);
}
else if (responseData.ContentType == HttpGitObjects.ContentType.BatchedLooseObjects)
{
BatchedLooseObjectDeserializer deserializer = new BatchedLooseObjectDeserializer(
responseData.Stream,
(stream, sha) => this.WriteLooseObject(this.Enlistment.WorkingDirectoryRoot, stream, sha));
deserializer.ProcessObjects();
}
else
{
GitProcess.Result result = this.TryAddPackFile(responseData.Stream, unpackObjects);
if (result.HasErrors)
{
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(new InvalidOperationException("Could not add pack file: " + result.Errors), shouldRetry: false);
}
}
return new RetryWrapper<HttpGitObjects.GitObjectTaskResult>.CallbackResult(new HttpGitObjects.GitObjectTaskResult(true));
}
private GitProcess.Result TryAddPackFile(Stream contents, bool unpackObjects)
{
Debug.Assert(contents != null, "contents should not be null");
GitProcess.Result result;
if (unpackObjects)
{
result = new GitProcess(this.Enlistment).UnpackObjects(contents);
}
else
{
string packfilePath = GetRandomPackName(this.Enlistment.GitPackRoot);
using (FileStream fileStream = File.OpenWrite(packfilePath))
{
contents.CopyTo(fileStream);
}
this.ValidateTempFile(packfilePath, packfilePath);
result = new GitProcess(this.Enlistment).IndexPack(packfilePath);
}
return result;
}
private struct LooseObjectToWrite
{
public readonly string TempFile;
public readonly string ActualFile;
public LooseObjectToWrite(string tempFile, string actualFile)
{
this.TempFile = tempFile;
this.ActualFile = actualFile;
}
}
}
}
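
For reference, a sketch of the loose-object fan-out used by GetLooseObjectDestination: the first two characters of the sha become a folder and the remaining 38 become the file name. The repo root and sha below are invented, and the layout assumes GVFSConstants.DotGit.Objects.Root resolves to the usual .git\objects path; it uses System.IO.

// Hypothetical layout example:
string sha = "abcdef0123456789abcdef0123456789abcdef01";
string objectPath = Path.Combine(
    @"C:\Repo\src", @".git\objects",
    sha.Substring(0, 2),   // "ab"  -> two-letter fan-out folder
    sha.Substring(2));     // remaining 38 characters -> file name
// objectPath == @"C:\Repo\src\.git\objects\ab\cdef0123456789abcdef0123456789abcdef01"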

Просмотреть файл

@ -0,0 +1,58 @@
using System;
using System.Collections.Generic;
using System.Text;
namespace GVFS.Common.Git
{
public static class GitPathConverter
{
private const int CharsInOctet = 3;
private const char OctetIndicator = '\\';
public static string ConvertPathOctetsToUtf8(string filePath)
{
if (filePath == null)
{
return null;
}
int octetIndicatorIndex = filePath.IndexOf(OctetIndicator);
if (octetIndicatorIndex == -1)
{
return filePath;
}
StringBuilder converted = new StringBuilder();
List<byte> octets = new List<byte>();
int index = 0;
while (octetIndicatorIndex != -1)
{
converted.Append(filePath.Substring(index, octetIndicatorIndex - index));
while (octetIndicatorIndex < filePath.Length && filePath[octetIndicatorIndex] == OctetIndicator)
{
string octet = filePath.Substring(octetIndicatorIndex + 1, CharsInOctet);
octets.Add(Convert.ToByte(octet, 8));
octetIndicatorIndex += CharsInOctet + 1;
}
AddOctetsAsUtf8(converted, octets);
index = octetIndicatorIndex;
octetIndicatorIndex = filePath.IndexOf(OctetIndicator, octetIndicatorIndex);
}
AddOctetsAsUtf8(converted, octets);
converted.Append(filePath.Substring(index));
return converted.ToString();
}
private static void AddOctetsAsUtf8(StringBuilder converted, List<byte> octets)
{
if (octets.Count > 0)
{
converted.Append(Encoding.UTF8.GetChars(octets.ToArray()));
octets.Clear();
}
}
}
}
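
A worked example of the conversion above: git escapes non-ASCII path bytes as backslash-prefixed octal triplets, so the two octets \303\244 are the UTF-8 bytes of 'ä'. The file name is invented.

// The raw path as git prints it (each "\303"/"\244" is a literal backslash plus three digits):
string escaped = "\\303\\244.txt";

string utf8Path = GitPathConverter.ConvertPathOctetsToUtf8(escaped);
// utf8Path == "ä.txt"   (0xC3 0xA4 is the UTF-8 encoding of 'ä')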

Просмотреть файл

@ -0,0 +1,517 @@
using GVFS.Common.Physical;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using Microsoft.Win32;
using System;
using System.ComponentModel;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text;
namespace GVFS.Common.Git
{
public class GitProcess
{
private const string GitProcessName = "git.exe";
private const string GitBinRelativePath = "cmd\\git.exe";
private const string GitInstallationRegistryKey = "SOFTWARE\\GitForWindows";
private const string GitInstallationRegistryInstallPathValue = "InstallPath";
private object executionLock = new object();
private Enlistment enlistment;
public GitProcess(Enlistment enlistment)
{
if (enlistment == null)
{
throw new ArgumentNullException(nameof(enlistment));
}
if (string.IsNullOrWhiteSpace(enlistment.GitBinPath))
{
throw new ArgumentException(nameof(enlistment.GitBinPath));
}
this.enlistment = enlistment;
}
public static bool GitExists(string gitBinPath)
{
if (!string.IsNullOrWhiteSpace(gitBinPath))
{
return File.Exists(gitBinPath);
}
return ProcessHelper.WhereDirectory(GitProcessName) != null;
}
public static string GetInstalledGitBinPath()
{
string gitBinPath = RegistryUtils.GetStringFromRegistry(RegistryHive.LocalMachine, GitInstallationRegistryKey, GitInstallationRegistryInstallPathValue);
if (!string.IsNullOrWhiteSpace(gitBinPath))
{
gitBinPath = Path.Combine(gitBinPath, GitBinRelativePath);
if (File.Exists(gitBinPath))
{
return gitBinPath;
}
}
return null;
}
public static Result Init(Enlistment enlistment)
{
return new GitProcess(enlistment).InvokeGitOutsideEnlistment("init " + enlistment.WorkingDirectoryRoot);
}
public static bool TryGetCredentials(
ITracer tracer,
Enlistment enlistment,
out string username,
out string password)
{
username = null;
password = null;
using (ITracer activity = tracer.StartActivity("TryGetCredentials", EventLevel.Informational))
{
Result gitCredentialOutput = new GitProcess(enlistment).InvokeGitOutsideEnlistment(
"credential fill",
stdin => stdin.Write("url=" + enlistment.RepoUrl + "\n\n"),
parseStdOutLine: null);
if (gitCredentialOutput.HasErrors)
{
EventMetadata errorData = new EventMetadata();
errorData.Add("ErrorMessage", "Git could not get credentials: " + gitCredentialOutput.Errors);
tracer.RelatedError(errorData, Keywords.Network);
return false;
}
username = ParseValue(gitCredentialOutput.Output, "username=");
password = ParseValue(gitCredentialOutput.Output, "password=");
bool success = username != null && password != null;
EventMetadata metadata = new EventMetadata();
metadata.Add("Success", success);
if (!success)
{
metadata.Add("Output", gitCredentialOutput.Output);
}
activity.Stop(metadata);
return success;
}
}
public static void RevokeCredential(Enlistment enlistment)
{
new GitProcess(enlistment).InvokeGitOutsideEnlistment(
"credential reject",
stdin => stdin.Write("url=" + enlistment.RepoUrl + "\n\n"),
null);
}
public static Result Version(Enlistment enlistment)
{
return new GitProcess(enlistment).InvokeGitOutsideEnlistment("--version");
}
public bool IsValidRepo()
{
Result result = this.InvokeGitAgainstDotGitFolder("rev-parse --show-toplevel");
return !result.HasErrors;
}
public Result RevParse(string gitRef)
{
return this.InvokeGitAgainstDotGitFolder("rev-parse " + gitRef);
}
public string GetRepoRoot()
{
Result result = this.InvokeGitAgainstDotGitFolder("rev-parse --show-toplevel");
if (result.HasErrors)
{
throw new InvalidRepoException(result.Errors);
}
return result.Output.TrimEnd('\r', '\n').Replace("/", "\\");
}
public void DeleteFromLocalConfig(string settingName)
{
this.InvokeGitAgainstDotGitFolder("config --local --unset-all " + settingName);
}
public Result SetInLocalConfig(string settingName, string value, bool replaceAll = false)
{
return this.InvokeGitAgainstDotGitFolder(string.Format(
"config --local {0} {1} {2}",
replaceAll ? "--replace-all " : string.Empty,
settingName,
value));
}
public Result GetAllLocalConfig()
{
return this.InvokeGitAgainstDotGitFolder("config --list --local");
}
public string GetFromConfig(string settingName)
{
// This method is called at clone time, so the physical repo may not exist yet.
Result result = Directory.Exists(this.enlistment.WorkingDirectoryRoot) ?
this.InvokeGitAgainstDotGitFolder("config " + settingName) :
this.InvokeGitOutsideEnlistment("config " + settingName);
// Git returns non-zero for non-existent settings and errors.
if (!result.HasErrors)
{
return result.Output.TrimEnd('\n');
}
else if (result.Errors.Any())
{
throw new InvalidRepoException("Error while reading '" + settingName + "' from config: " + result.Errors);
}
return null;
}
public Result GetOriginUrl()
{
Result result = this.InvokeGitAgainstDotGitFolder("remote -v");
if (result.HasErrors)
{
return result;
}
string[] lines = result.Output.Split('\r', '\n');
string originFetchLine = lines.Where(
l => l.StartsWith("origin", StringComparison.OrdinalIgnoreCase)
&& l.EndsWith("(fetch)")).FirstOrDefault();
if (originFetchLine == null)
{
throw new InvalidRepoException("remote 'origin' is not configured for this repo");
}
string[] parts = originFetchLine.Split('\t', ' ');
return new Result(parts[1], string.Empty, 0);
}
public void DiffTree(string sourceTreeish, string targetTreeish, Action<string> onResult)
{
this.InvokeGitAgainstDotGitFolder("diff-tree -r -t " + sourceTreeish + " " + targetTreeish, null, onResult);
}
public Result DiffWithNameOnlyAndFilterForAddedAndReanamedFiles(string commitish1, string commitish2)
{
return this.InvokeGitInWorkingDirectoryRoot("diff --name-only --diff-filter=AR " + commitish1 + " " + commitish2, useReadObjectHook: true);
}
public Result CreateBranchWithUpstream(string branchToCreate, string upstreamBranch)
{
return this.InvokeGitAgainstDotGitFolder("branch --set-upstream " + branchToCreate + " " + upstreamBranch);
}
public Result ForceCheckout(string target)
{
return this.InvokeGitInWorkingDirectoryRoot("checkout -f " + target, useReadObjectHook: false);
}
public Result UpdateIndexVersion4()
{
return this.InvokeGitAgainstDotGitFolder("update-index --index-version 4");
}
public Result UnpackObjects(Stream packFileStream)
{
return this.InvokeGitAgainstDotGitFolder(
"unpack-objects",
stdin =>
{
packFileStream.CopyTo(stdin.BaseStream);
stdin.Write('\n');
},
null);
}
public Result IndexPack(string packfilePath)
{
return this.InvokeGitAgainstDotGitFolder("index-pack \"" + packfilePath + "\"");
}
public Result RemoteAdd(string remoteName, string url)
{
return this.InvokeGitAgainstDotGitFolder("remote add " + remoteName + " " + url);
}
public Result CatFilePretty(string objectId)
{
return this.InvokeGitAgainstDotGitFolder("cat-file -p " + objectId);
}
public Result CatFileGetType(string objectId)
{
return this.InvokeGitAgainstDotGitFolder("cat-file -t " + objectId);
}
public Result CatFileBatchCheckAll(Action<string> parseStdOutLine)
{
return this.InvokeGitAgainstDotGitFolder("cat-file --batch-check --batch-all-objects", null, parseStdOutLine);
}
public Result LsTree(string treeish, Action<string> parseStdOutLine, bool recursive, bool showAllTrees = false)
{
return this.InvokeGitAgainstDotGitFolder(
"ls-tree " + (recursive ? "-r " : string.Empty) + (showAllTrees ? "-t " : string.Empty) + treeish,
null,
parseStdOutLine);
}
public Result SetUpstream(string branchName, string upstream)
{
return this.InvokeGitAgainstDotGitFolder("branch --set-upstream-to=" + upstream + " " + branchName);
}
public Result UpdateBranchSymbolicRef(string refToUpdate, string targetRef)
{
return this.InvokeGitAgainstDotGitFolder("symbolic-ref " + refToUpdate + " " + targetRef);
}
public Result UpdateBranchSha(string refToUpdate, string targetSha)
{
// If oldCommitResult doesn't fail, then the branch exists and update-ref will want the old sha
Result oldCommitResult = this.RevParse(refToUpdate);
string oldSha = string.Empty;
if (!oldCommitResult.HasErrors)
{
oldSha = oldCommitResult.Output.TrimEnd('\n');
}
return this.InvokeGitAgainstDotGitFolder("update-ref --no-deref " + refToUpdate + " " + targetSha + " " + oldSha);
}
public Result ReadTree(string treeIsh)
{
return this.InvokeGitAgainstDotGitFolder("read-tree " + treeIsh);
}
public Process GetGitProcess(string command, string workingDirectory, string dotGitDirectory, bool useReadObjectHook, bool redirectStandardError)
{
ProcessStartInfo processInfo = new ProcessStartInfo(this.enlistment.GitBinPath);
processInfo.WorkingDirectory = workingDirectory;
processInfo.UseShellExecute = false;
processInfo.RedirectStandardInput = true;
processInfo.RedirectStandardOutput = true;
processInfo.RedirectStandardError = redirectStandardError;
processInfo.WindowStyle = ProcessWindowStyle.Hidden;
processInfo.EnvironmentVariables["GIT_TERMINAL_PROMPT"] = "0";
processInfo.EnvironmentVariables["PATH"] =
string.Join(
";",
this.enlistment.GitBinPath,
this.enlistment.GVFSHooksRoot ?? string.Empty);
if (!useReadObjectHook)
{
command = "-c " + GVFSConstants.VirtualizeObjectsGitConfigName + "=false " + command;
}
if (!string.IsNullOrEmpty(dotGitDirectory))
{
command = "--git-dir=\"" + dotGitDirectory + "\" " + command;
}
processInfo.Arguments = command;
Process executingProcess = new Process();
executingProcess.StartInfo = processInfo;
return executingProcess;
}
private static string ParseValue(string contents, string prefix)
{
// Guard against the prefix being absent; otherwise IndexOf returns -1 and the
// offset arithmetic below would read from the wrong place in the output.
int prefixIndex = contents.IndexOf(prefix);
if (prefixIndex >= 0)
{
int startIndex = prefixIndex + prefix.Length;
int endIndex = contents.IndexOf('\n', startIndex);
if (endIndex >= 0 && endIndex < contents.Length)
{
return
contents
.Substring(startIndex, endIndex - startIndex)
.Trim('\r');
}
}
return null;
}
/// <summary>
/// Invokes git.exe without a working directory set.
/// </summary>
/// <remarks>
/// For commands where git doesn't need to be (or can't be) run from inside an enlistment.
/// eg. 'git init' or 'git credential'
/// </remarks>
private Result InvokeGitOutsideEnlistment(string command)
{
return this.InvokeGitOutsideEnlistment(command, null, null);
}
private Result InvokeGitOutsideEnlistment(
string command,
Action<StreamWriter> writeStdIn,
Action<string> parseStdOutLine,
int timeout = -1)
{
return this.InvokeGitImpl(
command,
workingDirectory: Environment.SystemDirectory,
dotGitDirectory: null,
useReadObjectHook: false,
writeStdIn: writeStdIn,
parseStdOutLine: parseStdOutLine,
timeoutMs: timeout);
}
/// <summary>
/// Invokes git.exe from an enlistment's repository root
/// </summary>
private Result InvokeGitInWorkingDirectoryRoot(string command, bool useReadObjectHook)
{
return this.InvokeGitImpl(
command,
workingDirectory: this.enlistment.WorkingDirectoryRoot,
dotGitDirectory: null,
useReadObjectHook: useReadObjectHook,
writeStdIn: null,
parseStdOutLine: null,
timeoutMs: -1);
}
/// <summary>
/// Invokes git.exe against an enlistment's .git folder.
/// This method should be used only with git-commands that ignore the working directory
/// </summary>
private Result InvokeGitAgainstDotGitFolder(string command)
{
return this.InvokeGitAgainstDotGitFolder(command, null, null);
}
private Result InvokeGitAgainstDotGitFolder(
string command,
Action<StreamWriter> writeStdIn,
Action<string> parseStdOutLine)
{
// This git command should not need/use the working directory of the repo.
// Run git.exe in Environment.SystemDirectory to ensure the git.exe process
// does not touch the working directory
return this.InvokeGitImpl(
command,
workingDirectory: Environment.SystemDirectory,
dotGitDirectory: this.enlistment.DotGitRoot,
useReadObjectHook: false,
writeStdIn: writeStdIn,
parseStdOutLine: parseStdOutLine,
timeoutMs: -1);
}
private Result InvokeGitImpl(
string command,
string workingDirectory,
string dotGitDirectory,
bool useReadObjectHook,
Action<StreamWriter> writeStdIn,
Action<string> parseStdOutLine,
int timeoutMs)
{
try
{
// From https://msdn.microsoft.com/en-us/library/system.diagnostics.process.standardoutput.aspx
// To avoid deadlocks, use asynchronous read operations on at least one of the streams.
// Do not perform a synchronous read to the end of both redirected streams.
using (Process executingProcess = this.GetGitProcess(command, workingDirectory, dotGitDirectory, useReadObjectHook, redirectStandardError: true))
{
StringBuilder output = new StringBuilder();
StringBuilder errors = new StringBuilder();
executingProcess.ErrorDataReceived += (sender, args) =>
{
if (args.Data != null)
{
errors.Append(args.Data + "\n");
}
};
executingProcess.OutputDataReceived += (sender, args) =>
{
if (args.Data != null)
{
if (parseStdOutLine != null)
{
parseStdOutLine(args.Data);
}
else
{
output.Append(args.Data + "\n");
}
}
};
lock (this.executionLock)
{
executingProcess.Start();
if (writeStdIn != null)
{
writeStdIn(executingProcess.StandardInput);
}
executingProcess.BeginOutputReadLine();
executingProcess.BeginErrorReadLine();
if (!executingProcess.WaitForExit(timeoutMs))
{
executingProcess.Kill();
return new Result(output.ToString(), "Operation timed out: " + errors.ToString(), Result.GenericFailureCode);
}
}
return new Result(output.ToString(), errors.ToString(), executingProcess.ExitCode);
}
}
catch (Win32Exception e)
{
return new Result(string.Empty, e.Message, Result.GenericFailureCode);
}
}
public class Result
{
public const int SuccessCode = 0;
public const int GenericFailureCode = 1;
public Result(string output, string errors, int returnCode)
{
this.Output = output;
this.Errors = errors;
this.ReturnCode = returnCode;
}
public string Output { get; }
public string Errors { get; }
public int ReturnCode { get; }
public bool HasErrors
{
get { return this.ReturnCode != SuccessCode; }
}
}
}
}
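
For context, a hedged sketch of the "git credential fill" exchange that TryGetCredentials drives; the URL and credential values are invented. Git reads key=value lines terminated by a blank line and answers in the same key=value format, which is what ParseValue scans for.

// Hypothetical exchange (values invented):
//
//   written to stdin:    url=https://example.visualstudio.com/repo
//                        (blank line ends the request)
//
//   read from stdout:    protocol=https
//                        host=example.visualstudio.com
//                        username=someuser
//                        password=sometoken
//
// ParseValue(output, "username=") and ParseValue(output, "password=")
// then pull the two values out of that newline-delimited block.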

Просмотреть файл

@ -0,0 +1,112 @@
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace GVFS.Common.Git
{
public class GitRefs
{
private const string Head = "HEAD\0";
private const string Master = "master";
private const string HeadRefPrefix = "refs/heads/";
private const string TagsRefPrefix = "refs/tags/";
private const string OriginRemoteRefPrefix = "refs/remotes/origin/";
private Dictionary<string, string> commitsPerRef;
private string remoteHeadCommitId = null;
public GitRefs(IEnumerable<string> infoRefsResponse, string branch)
{
// First 4 characters of a given line are the length of the line and not part of the commit id so
// skip them (https://git-scm.com/book/en/v2/Git-Internals-Transfer-Protocols)
this.commitsPerRef =
infoRefsResponse
.Where(line =>
line.Contains(" " + HeadRefPrefix) ||
(line.Contains(" " + TagsRefPrefix) && !line.Contains("^")))
.Where(line =>
branch == null ||
line.EndsWith(HeadRefPrefix + branch))
.Select(line => line.Split(' '))
.ToDictionary(
line => line[1].Replace(HeadRefPrefix, OriginRemoteRefPrefix),
line => line[0].Substring(4));
string lineWithHeadCommit = infoRefsResponse.FirstOrDefault(line => line.Contains(Head));
if (lineWithHeadCommit != null)
{
string[] tokens = lineWithHeadCommit.Split(' ');
if (tokens.Length >= 2 && tokens[1].StartsWith(Head))
{
// First 8 characters are not part of the commit id so skip them
this.remoteHeadCommitId = tokens[0].Substring(8);
}
}
}
public int Count
{
get { return this.commitsPerRef.Count; }
}
public IEnumerable<string> GetTipCommitIds()
{
return this.commitsPerRef.Values;
}
public string GetDefaultBranch()
{
IEnumerable<KeyValuePair<string, string>> headRefMatches = this.commitsPerRef.Where(reference =>
reference.Value == this.remoteHeadCommitId
&& reference.Key.StartsWith(OriginRemoteRefPrefix));
if (headRefMatches.Count() == 0 || headRefMatches.Count(reference => reference.Key == (OriginRemoteRefPrefix + Master)) > 0)
{
// Default to master if there is no HEAD, or if the HEAD commit ID matches master
// (this is the same behavior as git.exe)
return Master;
}
// If the HEAD commit ID does not match master grab the first branch that matches
string defaultBranch = headRefMatches.First().Key;
if (defaultBranch.Length < OriginRemoteRefPrefix.Length)
{
return Master;
}
return defaultBranch.Substring(OriginRemoteRefPrefix.Length);
}
/// <summary>
/// Checks if the specified branch exists (case sensitive)
/// </summary>
public bool HasBranch(string branch)
{
string branchRef = OriginRemoteRefPrefix + branch;
return this.commitsPerRef.ContainsKey(branchRef);
}
public IEnumerable<KeyValuePair<string, string>> GetBranchRefPairs()
{
return this.commitsPerRef.Select(kvp => new KeyValuePair<string, string>(kvp.Key, kvp.Value));
}
public string ToPackedRefs()
{
StringBuilder sb = new StringBuilder();
const string LF = "\n";
sb.Append("# pack-refs with: peeled fully-peeled" + LF);
foreach (string refName in this.commitsPerRef.Keys.OrderBy(key => key))
{
sb.Append(this.commitsPerRef[refName] + " " + refName + LF);
}
return sb.ToString();
}
}
}
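
A minimal sketch of the input the constructor expects, using an invented sha and assuming the usual System.Collections.Generic using: each smart-protocol line arrives with its 4-character pkt-line length prefix fused onto the commit id, which is why the constructor takes Substring(4).

// Hypothetical info/refs line: length prefix "003d" + 40-char sha + " refs/heads/master"
string sampleSha = new string('a', 40);
List<string> infoRefsResponse = new List<string> { "003d" + sampleSha + " refs/heads/master" };

GitRefs refs = new GitRefs(infoRefsResponse, branch: null);
// refs.GetTipCommitIds() yields sampleSha, stored under the key "refs/remotes/origin/master".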

Просмотреть файл

@ -0,0 +1,26 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace GVFS.Common.Git
{
public class GitTreeEntry
{
public GitTreeEntry(string name, string sha, bool isTree, bool isBlob)
{
this.Name = name;
this.Sha = sha;
this.IsTree = isTree;
this.IsBlob = isBlob;
}
public string Name { get; private set; }
public string Sha { get; private set; }
public bool IsTree { get; private set; }
public bool IsBlob { get; private set; }
public long Size { get; set; }
}
}

Просмотреть файл

@ -0,0 +1,130 @@
namespace GVFS.Common.Git
{
public class GitVersion
{
public GitVersion(int major, int minor, int build, string platform, int revision, int minorRevision)
{
this.Major = major;
this.Minor = minor;
this.Build = build;
this.Platform = platform;
this.Revision = revision;
this.MinorRevision = minorRevision;
}
public int Major { get; private set; }
public int Minor { get; private set; }
public string Platform { get; private set; }
public int Build { get; private set; }
public int Revision { get; private set; }
public int MinorRevision { get; private set; }
public static bool TryParse(string input, out GitVersion version)
{
version = null;
int major, minor, build, revision, minorRevision;
if (string.IsNullOrWhiteSpace(input))
{
return false;
}
string[] parsedComponents = input.Split(new char[] { '.' });
int parsedComponentsLength = parsedComponents.Length;
if (parsedComponentsLength < 5)
{
return false;
}
if (!TryParseComponent(parsedComponents[0], out major))
{
return false;
}
if (!TryParseComponent(parsedComponents[1], out minor))
{
return false;
}
if (!TryParseComponent(parsedComponents[2], out build))
{
return false;
}
if (!TryParseComponent(parsedComponents[4], out revision))
{
return false;
}
if (parsedComponentsLength < 6 || !TryParseComponent(parsedComponents[5], out minorRevision))
{
minorRevision = 0;
}
string platform = parsedComponents[3];
version = new GitVersion(major, minor, build, platform, revision, minorRevision);
return true;
}
public bool IsLessThan(GitVersion other)
{
return this.CompareVersionNumbers(other) < 0;
}
public override string ToString()
{
return string.Format("{0}.{1}.{2}.{3}.{4}.{5}", this.Major, this.Minor, this.Build, this.Platform, this.Revision, this.MinorRevision);
}
private static bool TryParseComponent(string component, out int parsedComponent)
{
if (!int.TryParse(component, out parsedComponent))
{
return false;
}
if (parsedComponent < 0)
{
return false;
}
return true;
}
private int CompareVersionNumbers(GitVersion other)
{
if (other == null)
{
return -1;
}
if (this.Major != other.Major)
{
return this.Major.CompareTo(other.Major);
}
if (this.Minor != other.Minor)
{
return this.Minor.CompareTo(other.Minor);
}
if (this.Build != other.Build)
{
return this.Build.CompareTo(other.Build);
}
if (this.Revision != other.Revision)
{
return this.Revision.CompareTo(other.Revision);
}
if (this.MinorRevision != other.MinorRevision)
{
return this.MinorRevision.CompareTo(other.MinorRevision);
}
return 0;
}
}
}
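
A worked example of the parsing above, using the Git for Windows style version string (at least five dot-separated components, with the platform in the fourth position).

// "2.11.0.windows.1" splits into { "2", "11", "0", "windows", "1" }:
GitVersion version;
bool parsed = GitVersion.TryParse("2.11.0.windows.1", out version);
// parsed == true; Major=2, Minor=11, Build=0, Platform="windows",
// Revision=1, MinorRevision=0 (defaulted because a sixth component is absent).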

Просмотреть файл

@ -0,0 +1,642 @@
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
namespace GVFS.Common.Git
{
public class HttpGitObjects
{
private const string AreaPath = "HttpGitObjects";
private const int HttpTimeoutMinutes = 10;
private const int DefaultMaxRetries = 5;
private const int AuthorizationBackoffMinutes = 1;
private static readonly MediaTypeWithQualityHeaderValue CustomLooseObjectsHeader
= new MediaTypeWithQualityHeaderValue(GVFSConstants.MediaTypes.CustomLooseObjectsMediaType);
private static HttpClient client;
private readonly ProductInfoHeaderValue userAgentHeader;
private Enlistment enlistment;
private ITracer tracer;
private DateTime authRetryBackoff = DateTime.MinValue;
private bool credentialHasBeenRevoked = false;
private object gitAuthorizationLock = new object();
private string gitAuthorization;
static HttpGitObjects()
{
client = new HttpClient();
client.Timeout = TimeSpan.FromMinutes(HttpTimeoutMinutes);
}
public HttpGitObjects(ITracer tracer, Enlistment enlistment, int maxConnections)
{
this.tracer = tracer;
this.enlistment = enlistment;
this.MaxRetries = DefaultMaxRetries;
ServicePointManager.DefaultConnectionLimit = maxConnections;
this.userAgentHeader = new ProductInfoHeaderValue(ProcessHelper.GetEntryClassName(), ProcessHelper.GetCurrentProcessVersion());
}
public enum ContentType
{
None,
LooseObject,
BatchedLooseObjects,
PackFile
}
public int MaxRetries { get; set; }
public bool TryRefreshCredentials()
{
return this.TryGetCredentials(out this.gitAuthorization);
}
public virtual List<GitObjectSize> QueryForFileSizes(IEnumerable<string> objectIds)
{
string objectIdsJson = ToJsonList(objectIds);
Uri gvfsEndpoint = new Uri(this.enlistment.RepoUrl + "/gvfs/sizes");
EventMetadata metadata = new EventMetadata();
int objectIdCount = objectIds.Count();
if (objectIdCount > 10)
{
metadata.Add("ObjectIdCount", objectIdCount);
}
else
{
metadata.Add("ObjectIdJson", objectIdsJson);
}
this.tracer.RelatedEvent(EventLevel.Informational, "QueryFileSizes", metadata, Keywords.Network);
RetryWrapper<List<GitObjectSize>> retrier = new RetryWrapper<List<GitObjectSize>>(this.MaxRetries);
retrier.OnFailure += RetryWrapper<List<GitObjectSize>>.StandardErrorHandler(this.tracer, "QueryFileSizes");
RetryWrapper<List<GitObjectSize>>.InvocationResult requestTask = retrier.InvokeAsync(
async tryCount =>
{
GitEndPointResponseData response = this.SendRequest(gvfsEndpoint, HttpMethod.Post, objectIdsJson);
if (response.HasErrors)
{
return new RetryWrapper<List<GitObjectSize>>.CallbackResult(response.Error, response.ShouldRetry);
}
using (Stream objectSizesStream = response.Stream)
using (StreamReader reader = new StreamReader(objectSizesStream))
{
string objectSizesString = await reader.ReadToEndAsync();
List<GitObjectSize> objectSizes = JsonConvert.DeserializeObject<List<GitObjectSize>>(objectSizesString);
return new RetryWrapper<List<GitObjectSize>>.CallbackResult(objectSizes);
}
}).Result;
return requestTask.Result ?? new List<GitObjectSize>(0);
}
public GVFSConfigResponse QueryGVFSConfig()
{
Uri gvfsConfigEndpoint;
string gvfsConfigEndpointString = this.enlistment.RepoUrl + GVFSConstants.GVFSConfigEndpointSuffix;
try
{
gvfsConfigEndpoint = new Uri(gvfsConfigEndpointString);
}
catch (UriFormatException e)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Method", nameof(this.QueryGVFSConfig));
metadata.Add("ErrorMessage", e);
metadata.Add("Url", gvfsConfigEndpointString);
this.tracer.RelatedError(metadata, Keywords.Network);
return null;
}
RetryWrapper<GVFSConfigResponse> retrier = new RetryWrapper<GVFSConfigResponse>(this.MaxRetries);
retrier.OnFailure += RetryWrapper<GVFSConfigResponse>.StandardErrorHandler(this.tracer, "QueryGvfsConfig");
RetryWrapper<GVFSConfigResponse>.InvocationResult output = retrier.Invoke(
tryCount =>
{
GitEndPointResponseData response = this.SendRequest(gvfsConfigEndpoint, HttpMethod.Get, null);
if (response.HasErrors)
{
return new RetryWrapper<GVFSConfigResponse>.CallbackResult(response.Error, response.ShouldRetry);
}
using (Stream responseStream = response.Stream)
using (StreamReader reader = new StreamReader(responseStream))
{
try
{
return new RetryWrapper<GVFSConfigResponse>.CallbackResult(
JsonConvert.DeserializeObject<GVFSConfigResponse>(reader.ReadToEnd()));
}
catch (JsonReaderException e)
{
return new RetryWrapper<GVFSConfigResponse>.CallbackResult(e, false);
}
}
});
return output.Result;
}
public virtual GitRefs QueryInfoRefs(string branch)
{
Uri infoRefsEndpoint;
try
{
infoRefsEndpoint = new Uri(this.enlistment.RepoUrl + GVFSConstants.InfoRefsEndpointSuffix);
}
catch (UriFormatException)
{
return null;
}
RetryWrapper<GitRefs> retrier = new RetryWrapper<GitRefs>(this.MaxRetries);
retrier.OnFailure += RetryWrapper<GitRefs>.StandardErrorHandler(this.tracer, "QueryInfoRefs");
RetryWrapper<GitRefs>.InvocationResult output = retrier.Invoke(
tryCount =>
{
GitEndPointResponseData response = this.SendRequest(infoRefsEndpoint, HttpMethod.Get, null);
if (response.HasErrors)
{
return new RetryWrapper<GitRefs>.CallbackResult(response.Error, response.ShouldRetry);
}
using (Stream responseStream = response.Stream)
using (StreamReader reader = new StreamReader(responseStream))
{
List<string> infoRefsResponse = new List<string>();
while (!reader.EndOfStream)
{
infoRefsResponse.Add(reader.ReadLine());
}
return new RetryWrapper<GitRefs>.CallbackResult(new GitRefs(infoRefsResponse, branch));
}
});
return output.Result;
}
/// <summary>
/// Get the <see cref="Uri"/>s to download and store in the pack directory for bootstrapping
/// </summary>
public IList<Uri> TryGetBootstrapPackSources(Uri bootstrapSource, string branchName)
{
IList<Uri> packUris = null;
RetryWrapper<GitObjectTaskResult>.InvocationResult output = this.TrySendProtocolRequest(
onSuccess: (tryCount, response) =>
{
using (StreamReader sr = new StreamReader(response.Stream))
{
packUris = JsonConvert.DeserializeObject<BootstrapResponse>(sr.ReadToEnd()).PackUris;
}
return new RetryWrapper<GitObjectTaskResult>.CallbackResult(new GitObjectTaskResult(true));
},
onFailure: RetryWrapper<GitObjectTaskResult>.StandardErrorHandler(this.tracer, nameof(this.TryGetBootstrapPackSources)),
method: HttpMethod.Post,
endPoint: bootstrapSource,
requestBody: JsonConvert.SerializeObject(new { branchName = branchName }));
return packUris;
}
public virtual RetryWrapper<GitObjectTaskResult>.InvocationResult TryDownloadLooseObject(
string objectId,
Func<int, GitEndPointResponseData, RetryWrapper<GitObjectTaskResult>.CallbackResult> onSuccess,
Action<RetryWrapper<GitObjectTaskResult>.ErrorEventArgs> onFailure)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("ObjectId", objectId);
this.tracer.RelatedEvent(EventLevel.Informational, "DownloadLooseObject", metadata, Keywords.Network);
return this.TrySendProtocolRequest(
onSuccess,
onFailure,
HttpMethod.Get,
new Uri(this.enlistment.ObjectsEndpointUrl + "/" + objectId));
}
public virtual RetryWrapper<GitObjectTaskResult>.InvocationResult TryDownloadObjects(
Func<IEnumerable<string>> objectIdGenerator,
int commitDepth,
Func<int, GitEndPointResponseData, RetryWrapper<GitObjectTaskResult>.CallbackResult> onSuccess,
Action<RetryWrapper<GitObjectTaskResult>.ErrorEventArgs> onFailure,
bool preferBatchedLooseObjects)
{
// We pass the query generator in as a function because we don't want the consumer to know about JSON or network retry logic,
// but we still want the consumer to be able to change the query on each retry if we fail during their onSuccess handler.
return this.TrySendProtocolRequest(
onSuccess,
onFailure,
HttpMethod.Post,
new Uri(this.enlistment.ObjectsEndpointUrl),
() => this.ObjectIdsJsonGenerator(objectIdGenerator, commitDepth),
preferBatchedLooseObjects ? CustomLooseObjectsHeader : null);
}
public virtual RetryWrapper<GitObjectTaskResult>.InvocationResult TryDownloadObjects(
IEnumerable<string> objectIds,
int commitDepth,
Func<int, GitEndPointResponseData, RetryWrapper<GitObjectTaskResult>.CallbackResult> onSuccess,
Action<RetryWrapper<GitObjectTaskResult>.ErrorEventArgs> onFailure,
bool preferBatchedLooseObjects)
{
string objectIdsJson = CreateObjectIdJson(objectIds, commitDepth);
int objectCount = objectIds.Count();
EventMetadata metadata = new EventMetadata();
if (objectCount < 10)
{
metadata.Add("ObjectIds", string.Join(", ", objectIds));
}
else
{
metadata.Add("ObjectIdCount", objectCount);
}
this.tracer.RelatedEvent(EventLevel.Informational, "DownloadObjects", metadata, Keywords.Network);
return this.TrySendProtocolRequest(
onSuccess,
onFailure,
HttpMethod.Post,
new Uri(this.enlistment.ObjectsEndpointUrl),
objectIdsJson,
preferBatchedLooseObjects ? CustomLooseObjectsHeader : null);
}
public virtual RetryWrapper<GitObjectTaskResult>.InvocationResult TrySendProtocolRequest(
Func<int, GitEndPointResponseData, RetryWrapper<GitObjectTaskResult>.CallbackResult> onSuccess,
Action<RetryWrapper<GitObjectTaskResult>.ErrorEventArgs> onFailure,
HttpMethod method,
Uri endPoint,
string requestBody = null,
MediaTypeWithQualityHeaderValue acceptType = null)
{
return this.TrySendProtocolRequest(
onSuccess,
onFailure,
method,
endPoint,
() => requestBody,
acceptType);
}
public virtual RetryWrapper<GitObjectTaskResult>.InvocationResult TrySendProtocolRequest(
Func<int, GitEndPointResponseData, RetryWrapper<GitObjectTaskResult>.CallbackResult> onSuccess,
Action<RetryWrapper<GitObjectTaskResult>.ErrorEventArgs> onFailure,
HttpMethod method,
Uri endPoint,
Func<string> requestBodyGenerator,
MediaTypeWithQualityHeaderValue acceptType = null)
{
return this.TrySendProtocolRequest(
onSuccess,
onFailure,
method,
() => endPoint,
requestBodyGenerator,
acceptType);
}
public virtual RetryWrapper<GitObjectTaskResult>.InvocationResult TrySendProtocolRequest(
Func<int, GitEndPointResponseData, RetryWrapper<GitObjectTaskResult>.CallbackResult> onSuccess,
Action<RetryWrapper<GitObjectTaskResult>.ErrorEventArgs> onFailure,
HttpMethod method,
Func<Uri> endPointGenerator,
Func<string> requestBodyGenerator,
MediaTypeWithQualityHeaderValue acceptType = null)
{
RetryWrapper<GitObjectTaskResult> retrier = new RetryWrapper<GitObjectTaskResult>(this.MaxRetries);
if (onFailure != null)
{
retrier.OnFailure += onFailure;
}
return retrier.Invoke(
tryCount =>
{
GitEndPointResponseData response = this.SendRequest(
endPointGenerator(),
method,
requestBodyGenerator(),
acceptType);
if (response.HasErrors)
{
return new RetryWrapper<GitObjectTaskResult>.CallbackResult(response.Error, response.ShouldRetry, new GitObjectTaskResult(response.StatusCode));
}
using (Stream responseStream = response.Stream)
{
return onSuccess(tryCount, response);
}
});
}
private static string ToJsonList(IEnumerable<string> strings)
{
return "[\"" + string.Join("\",\"", strings) + "\"]";
}
private static string CreateObjectIdJson(IEnumerable<string> strings, int commitDepth)
{
return "{\"commitDepth\": " + commitDepth + ", \"objectIds\":" + ToJsonList(strings) + "}";
}
private static bool ShouldRetry(HttpStatusCode statusCode)
{
// Retry timeouts and 5xx errors
int statusInt = (int)statusCode;
if (statusCode == HttpStatusCode.RequestTimeout ||
(statusInt >= 500 && statusInt < 600))
{
return true;
}
return false;
}
private bool TryGetCredentials(out string authString)
{
authString = this.gitAuthorization;
if (authString == null)
{
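// Double-checked locking: re-check under the lock so that only one thread
// calls out to the git credential manager when the cached authorization is missing.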
lock (this.gitAuthorizationLock)
{
if (this.gitAuthorization == null)
{
string gitUsername;
string gitPassword;
bool backingOff = DateTime.Now < this.authRetryBackoff;
if (this.credentialHasBeenRevoked)
{
// Update backoff after an immediate first retry.
this.authRetryBackoff = DateTime.Now.AddMinutes(AuthorizationBackoffMinutes);
}
if (backingOff ||
!GitProcess.TryGetCredentials(this.tracer, this.enlistment, out gitUsername, out gitPassword))
{
authString = null;
return false;
}
this.gitAuthorization = Convert.ToBase64String(Encoding.ASCII.GetBytes(gitUsername + ":" + gitPassword));
}
authString = this.gitAuthorization;
}
}
return true;
}
private GitEndPointResponseData SendRequest(
Uri requestUri,
HttpMethod httpMethod,
string requestContent,
MediaTypeWithQualityHeaderValue acceptType = null)
{
string authString;
if (!this.TryGetCredentials(out authString))
{
string message =
this.authRetryBackoff == DateTime.MinValue
? "Authorization failed."
: "Authorization failed. No retries will be made until: " + this.authRetryBackoff;
return new GitEndPointResponseData(
HttpStatusCode.Unauthorized,
new HttpGitObjectsException(HttpStatusCode.Unauthorized, message),
shouldRetry: false);
}
HttpRequestMessage request = new HttpRequestMessage(httpMethod, requestUri);
request.Headers.UserAgent.Add(this.userAgentHeader);
request.Headers.Authorization = new AuthenticationHeaderValue("Basic", authString);
if (acceptType != null)
{
request.Headers.Accept.Add(acceptType);
}
if (requestContent != null)
{
request.Content = new StringContent(requestContent, Encoding.UTF8, "application/json");
}
try
{
HttpResponseMessage response = client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead).Result;
if (response.StatusCode == HttpStatusCode.OK)
{
string contentType = string.Empty;
IEnumerable<string> values;
if (response.Content.Headers.TryGetValues("Content-Type", out values))
{
contentType = values.First();
}
this.credentialHasBeenRevoked = false;
Stream responseStream = response.Content.ReadAsStreamAsync().Result;
return new GitEndPointResponseData(response.StatusCode, contentType, responseStream);
}
else
{
string errorMessage = response.Content.ReadAsStringAsync().Result;
int statusInt = (int)response.StatusCode;
if (string.IsNullOrWhiteSpace(errorMessage))
{
if (response.StatusCode == HttpStatusCode.Unauthorized)
{
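// The first 401 revokes the stored credential and retries once with a freshly
// acquired one; a second 401 after that renewal stops further attempts and
// fails without retrying.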
lock (this.gitAuthorizationLock)
{
// Wipe the username and password so we can try recovering if applicable.
this.gitAuthorization = null;
if (!this.credentialHasBeenRevoked)
{
GitProcess.RevokeCredential(this.enlistment);
this.credentialHasBeenRevoked = true;
return new GitEndPointResponseData(
response.StatusCode,
new HttpGitObjectsException(response.StatusCode, "Server returned error code 401 (Unauthorized). Your PAT may be expired."),
shouldRetry: true);
}
else
{
this.authRetryBackoff = DateTime.MaxValue;
return new GitEndPointResponseData(
response.StatusCode,
new HttpGitObjectsException(response.StatusCode, "Server returned error code 401 (Unauthorized) after successfully renewing your PAT. You may not have access to this repo"),
shouldRetry: false);
}
}
}
else
{
errorMessage = string.Format("Server returned error code {0} ({1})", statusInt, response.StatusCode);
}
}
return new GitEndPointResponseData(response.StatusCode, new HttpGitObjectsException(response.StatusCode, errorMessage), ShouldRetry(response.StatusCode));
}
}
catch (TaskCanceledException)
{
string errorMessage = string.Format("Request to {0} timed out", requestUri);
return new GitEndPointResponseData(HttpStatusCode.RequestTimeout, new HttpGitObjectsException(HttpStatusCode.RequestTimeout, errorMessage), shouldRetry: true);
}
catch (WebException ex)
{
return new GitEndPointResponseData(HttpStatusCode.InternalServerError, ex, shouldRetry: true);
}
}
private string ObjectIdsJsonGenerator(Func<IEnumerable<string>> objectIdGenerator, int commitDepth)
{
IEnumerable<string> objectIds = objectIdGenerator();
string objectIdsJson = CreateObjectIdJson(objectIds, commitDepth);
int objectCount = objectIds.Count();
EventMetadata metadata = new EventMetadata();
if (objectCount < 10)
{
metadata.Add("ObjectIds", string.Join(", ", objectIds));
}
else
{
metadata.Add("ObjectIdCount", objectCount);
}
this.tracer.RelatedEvent(EventLevel.Informational, "DownloadObjects", metadata, Keywords.Network);
return objectIdsJson;
}
public class GitObjectSize
{
public readonly string Id;
public readonly long Size;
[JsonConstructor]
public GitObjectSize(string id, long size)
{
this.Id = id;
this.Size = size;
}
}
public class GitObjectTaskResult
{
public GitObjectTaskResult(bool success)
{
this.Success = success;
}
public GitObjectTaskResult(HttpStatusCode statusCode)
: this(statusCode == HttpStatusCode.OK)
{
this.HttpStatusCodeResult = statusCode;
}
public bool Success { get; }
public HttpStatusCode HttpStatusCodeResult { get; }
}
public class GitEndPointResponseData
{
/// <summary>
/// Constructor used when GitEndPointResponseData contains an error response
/// </summary>
public GitEndPointResponseData(HttpStatusCode statusCode, Exception error, bool shouldRetry)
{
this.StatusCode = statusCode;
this.Error = error;
this.ShouldRetry = shouldRetry;
}
/// <summary>
/// Constructor used when GitEndPointResponseData contains a successful response
/// </summary>
public GitEndPointResponseData(HttpStatusCode statusCode, string contentType, Stream responseStream)
: this(statusCode, null, false)
{
this.Stream = responseStream;
this.ContentType = MapContentType(contentType);
}
/// <summary>
/// Stream returned by a successful response. If the response is an error, Stream will be null
/// </summary>
public Stream Stream { get; }
public Exception Error { get; }
public bool ShouldRetry { get; }
public HttpStatusCode StatusCode { get; }
public bool HasErrors
{
get { return this.StatusCode != HttpStatusCode.OK; }
}
public ContentType ContentType { get; }
/// <summary>
/// Convert from a string-based Content-Type to <see cref="HttpGitObjects.ContentType"/>
/// </summary>
private static ContentType MapContentType(string contentType)
{
switch (contentType)
{
case GVFSConstants.MediaTypes.LooseObjectMediaType:
return ContentType.LooseObject;
case GVFSConstants.MediaTypes.CustomLooseObjectsMediaType:
return ContentType.BatchedLooseObjects;
case GVFSConstants.MediaTypes.PackFileMediaType:
return ContentType.PackFile;
default:
return ContentType.None;
}
}
}
public class HttpGitObjectsException : Exception
{
public HttpGitObjectsException(HttpStatusCode statusCode, string ex) : base(ex)
{
this.StatusCode = statusCode;
}
public HttpStatusCode StatusCode { get; }
}
private class BootstrapResponse
{
public IList<Uri> PackUris { get; set; }
}
}
}


@ -0,0 +1,39 @@
using System;
using System.Linq;
namespace GVFS.Common
{
public static class GitHelper
{
/// <summary>
/// Determines whether the given command line represents any of the git verbs passed in.
/// </summary>
/// <param name="commandLine">The git command line.</param>
/// <param name="verbs">A list of verbs (eg. "status" not "git status").</param>
/// <returns>True if the command line represents any of the verbs, false otherwise.</returns>
public static bool IsVerb(string commandLine, params string[] verbs)
{
if (verbs == null || verbs.Length < 1)
{
throw new ArgumentException("At least one verb must be provided.", nameof(verbs));
}
return
verbs.Any(v =>
{
string verbCommand = "git " + v;
return
commandLine == verbCommand ||
commandLine.StartsWith(verbCommand + " ");
});
}
/// <summary>
/// Returns true if the string is 40 characters long and contains only valid hex characters
/// </summary>
public static bool IsValidFullSHA(string sha)
{
return sha.Length == 40 && !sha.Any(c => !(c >= '0' && c <= '9') && !(c >= 'a' && c <= 'f') && !(c >= 'A' && c <= 'F'));
}
}
}


@ -0,0 +1,40 @@
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Threading;
namespace GVFS.Common
{
public class HeartbeatThread
{
private static readonly TimeSpan HeartBeatWaitTime = TimeSpan.FromMinutes(15);
private readonly ITracer tracer;
private Timer thread;
private DateTime startTime;
public HeartbeatThread(ITracer tracer)
{
this.tracer = tracer;
}
public void Start()
{
this.startTime = DateTime.Now;
this.thread = new Timer(
this.EmitHeartbeat,
state: null,
dueTime: HeartBeatWaitTime,
period: HeartBeatWaitTime);
}
private void EmitHeartbeat(object unusedState)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("MinutesUptime", (long)(DateTime.Now - this.startTime).TotalMinutes);
metadata.Add("MinutesSinceLast", (int)HeartBeatWaitTime.TotalMinutes);
this.tracer.RelatedEvent(EventLevel.Verbose, "Heartbeat", metadata);
}
}
}


@ -0,0 +1,9 @@
using System;
namespace GVFS.Common
{
public interface IBackgroundOperation
{
Guid Id { get; set; }
}
}


@ -0,0 +1,12 @@
using System;
namespace GVFS.Common
{
public class InvalidRepoException : Exception
{
public InvalidRepoException(string message)
: base(message)
{
}
}
}


@ -0,0 +1,12 @@
namespace GVFS.Common
{
public static class MountParameters
{
public const string Verbosity = "verbosity";
public const string Keywords = "keywords";
public const string DebugWindow = "debug-window";
public const string DefaultVerbosity = "Informational";
public const string DefaultKeywords = "Any";
}
}


@ -0,0 +1,13 @@
using System;
using System.IO;
namespace GVFS.Common.NamedPipes
{
public class BrokenPipeException : Exception
{
public BrokenPipeException(string message, IOException innerException)
: base(message, innerException)
{
}
}
}


@ -0,0 +1,107 @@
using System;
using System.IO;
using System.IO.Pipes;
namespace GVFS.Common.NamedPipes
{
public class NamedPipeClient : IDisposable
{
private string pipeName;
private NamedPipeClientStream clientStream;
private StreamReader reader;
private StreamWriter writer;
public NamedPipeClient(string pipeName)
{
this.pipeName = pipeName;
}
public bool Connect(int timeoutMilliseconds = 3000)
{
if (this.clientStream != null)
{
throw new InvalidOperationException();
}
try
{
this.clientStream = new NamedPipeClientStream(this.pipeName);
this.clientStream.Connect(timeoutMilliseconds);
}
catch (TimeoutException)
{
return false;
}
catch (IOException)
{
return false;
}
this.reader = new StreamReader(this.clientStream);
this.writer = new StreamWriter(this.clientStream);
return true;
}
public void SendRequest(NamedPipeMessages.Message message)
{
this.SendRequest(message.ToString());
}
public void SendRequest(string message)
{
this.ValidateConnection();
try
{
this.writer.WriteLine(message);
this.writer.Flush();
}
catch (IOException e)
{
throw new BrokenPipeException("Unable to send: " + message, e);
}
}
public string ReadRawResponse()
{
try
{
string response = this.reader.ReadLine();
if (response == null)
{
throw new BrokenPipeException("Unable to read from pipe", null);
}
return response;
}
catch (IOException e)
{
throw new BrokenPipeException("Unable to read from pipe", e);
}
}
public NamedPipeMessages.Message ReadResponse()
{
return NamedPipeMessages.Message.FromString(this.ReadRawResponse());
}
public void Dispose()
{
this.ValidateConnection();
this.clientStream.Dispose();
this.clientStream = null;
this.reader = null;
this.writer = null;
}
private void ValidateConnection()
{
if (this.clientStream == null)
{
throw new InvalidOperationException("There is no connection");
}
}
}
}


@ -0,0 +1,242 @@
using Newtonsoft.Json;
using System;
namespace GVFS.Common.NamedPipes
{
public static class NamedPipeMessages
{
public const string UnknownRequest = "UnknownRequest";
public const string UnknownGVFSState = "UnknownGVFSState";
private const char MessageSeparator = '|';
public static class GetStatus
{
public const string Request = "GetStatus";
public const string Mounting = "Mounting";
public const string Ready = "Ready";
public const string Unmounting = "Unmounting";
public const string MountFailed = "MountFailed";
public class Response
{
public string MountStatus { get; set; }
public string EnlistmentRoot { get; set; }
public string RepoUrl { get; set; }
public string ObjectsUrl { get; set; }
public int BackgroundOperationCount { get; set; }
public string LockStatus { get; set; }
public int DiskLayoutVersion { get; set; }
public static Response FromJson(string json)
{
return JsonConvert.DeserializeObject<Response>(json);
}
public string ToJson()
{
return JsonConvert.SerializeObject(this);
}
}
}
public static class Unmount
{
public const string Request = "Unmount";
public const string NotMounted = "NotMounted";
public const string Acknowledged = "Ack";
public const string Completed = "Complete";
public const string AlreadyUnmounting = "AlreadyUnmounting";
public const string MountFailed = "MountFailed";
}
public static class DownloadObject
{
public const string DownloadRequest = "DLO";
public const string SuccessResult = "S";
public const string DownloadFailed = "F";
public const string InvalidSHAResult = "InvalidSHA";
public const string MountNotReadyResult = "MountNotReady";
public class Request
{
public Request(Message message)
{
this.RequestSha = message.Body;
}
public string RequestSha { get; }
public Message CreateMessage()
{
return new Message(DownloadRequest, this.RequestSha);
}
}
public class Response
{
public Response(string result)
{
this.Result = result;
}
public string Result { get; }
public Message CreateMessage()
{
return new Message(this.Result, null);
}
}
}
public static class AcquireLock
{
public const string AcquireRequest = "AcquireLock";
public const string DenyGVFSResult = "LockDeniedGVFS";
public const string DenyGitResult = "LockDeniedGit";
public const string AcceptResult = "LockAcquired";
public const string MountNotReadyResult = "MountNotReady";
public class Request
{
public Request(int pid, string parsedCommand, string originalCommand)
{
this.RequestData = new Data(pid, parsedCommand, originalCommand);
}
public Request(Message message)
{
this.RequestData = message.DeserializeBody<Data>();
}
public Data RequestData { get; }
public Message CreateMessage()
{
return new Message(AcquireRequest, this.RequestData);
}
}
public class Response
{
public Response(string result, Data responseData = null)
{
this.Result = result;
this.ResponseData = responseData;
}
public Response(Message message)
{
this.Result = message.Header;
this.ResponseData = message.DeserializeBody<Data>();
}
public string Result { get; }
public Data ResponseData { get; }
public Message CreateMessage()
{
return new Message(this.Result, this.ResponseData);
}
}
public class Data
{
public Data(int pid, string parsedCommand, string originalCommand)
{
this.PID = pid;
this.ParsedCommand = parsedCommand;
this.OriginalCommand = originalCommand;
}
public int PID { get; set; }
/// <summary>
/// The command line requesting the lock, built internally for parsing purposes.
/// e.g. "git status", "git rebase"
/// </summary>
public string ParsedCommand { get; set; }
/// <summary>
/// The command line for the process requesting the lock, as kept by the OS.
/// e.g. "c:\bin\git\git.exe git-rebase origin/master"
/// </summary>
public string OriginalCommand { get; set; }
public override string ToString()
{
return this.ParsedCommand + " (" + this.PID + ")";
}
}
}
public class Message
{
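// A message travels over the pipe as a single line of the form "Header|Body",
// where the body is typically a JSON-serialized payload (see MessageSeparator and ToString below).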
public Message(string header, object body)
: this(header, JsonConvert.SerializeObject(body))
{
}
private Message(string header, string body)
{
this.Header = header;
this.Body = body;
}
public string Header { get; }
public string Body { get; }
public static Message FromString(string message)
{
string header = null;
string body = null;
if (!string.IsNullOrEmpty(message))
{
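// Split on the first separator only, so a body that itself contains '|' is preserved intact.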
string[] parts = message.Split(new[] { NamedPipeMessages.MessageSeparator }, count: 2);
header = parts[0];
if (parts.Length > 1)
{
body = parts[1];
}
}
return new Message(header, body);
}
public TBody DeserializeBody<TBody>()
{
if (string.IsNullOrEmpty(this.Body))
{
return default(TBody);
}
try
{
return JsonConvert.DeserializeObject<TBody>(this.Body);
}
catch (JsonException jsonException)
{
throw new ArgumentException("Unrecognized body contents.", nameof(this.Body), jsonException);
}
}
public override string ToString()
{
string result = string.Empty;
if (!string.IsNullOrEmpty(this.Header))
{
result = this.Header;
}
if (this.Body != null)
{
result = result + NamedPipeMessages.MessageSeparator + this.Body;
}
return result;
}
}
}
}


@ -0,0 +1,117 @@
using System;
using System.IO;
using System.IO.Pipes;
using System.Security.AccessControl;
using System.Threading;
namespace GVFS.Common.NamedPipes
{
public class NamedPipeServer
{
private string pipeName;
private Action<Connection> handleConnection;
public NamedPipeServer(string pipeName, Action<Connection> handleConnection)
{
this.pipeName = pipeName;
this.handleConnection = handleConnection;
}
public void Start()
{
this.CreateNewListenerThread();
}
private void CreateNewListenerThread()
{
new Thread(this.ListenForNewConnection).Start();
}
private void ListenForNewConnection()
{
PipeSecurity security = new PipeSecurity();
security.AddAccessRule(new PipeAccessRule("Users", PipeAccessRights.ReadWrite | PipeAccessRights.CreateNewInstance, AccessControlType.Allow));
security.AddAccessRule(new PipeAccessRule("CREATOR OWNER", PipeAccessRights.FullControl, AccessControlType.Allow));
security.AddAccessRule(new PipeAccessRule("SYSTEM", PipeAccessRights.FullControl, AccessControlType.Allow));
NamedPipeServerStream serverStream = new NamedPipeServerStream(
this.pipeName,
PipeDirection.InOut,
NamedPipeServerStream.MaxAllowedServerInstances,
PipeTransmissionMode.Byte,
PipeOptions.WriteThrough,
0, // default inBufferSize
0, // default outBufferSize
security,
HandleInheritability.None);
serverStream.WaitForConnection();
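// As soon as a client connects, start the next listener thread so additional
// clients can connect while this connection is being serviced.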
this.CreateNewListenerThread();
using (Connection connection = new Connection(serverStream))
{
this.handleConnection(connection);
}
}
public class Connection : IDisposable
{
private NamedPipeServerStream serverStream;
private StreamReader reader;
private StreamWriter writer;
public Connection(NamedPipeServerStream serverStream)
{
this.serverStream = serverStream;
this.reader = new StreamReader(this.serverStream);
this.writer = new StreamWriter(this.serverStream);
}
public bool IsConnected
{
get { return this.serverStream.IsConnected; }
}
public string ReadRequest()
{
try
{
return this.reader.ReadLine();
}
catch (IOException)
{
return null;
}
}
public bool TrySendResponse(string message)
{
try
{
this.writer.WriteLine(message);
this.writer.Flush();
return true;
}
catch (IOException)
{
return false;
}
}
public bool TrySendResponse(NamedPipeMessages.Message message)
{
return this.TrySendResponse(message.ToString());
}
public void Dispose()
{
this.serverStream.Dispose();
this.serverStream = null;
this.reader = null;
this.writer = null;
}
}
}
}


@ -0,0 +1,158 @@
using Microsoft.Win32.SafeHandles;
using System;
using System.ComponentModel;
using System.IO;
using System.Linq;
using System.Runtime.InteropServices;
namespace GVFS.Common
{
public static class NativeMethods
{
public const int ERROR_FILE_NOT_FOUND = 2;
public const int ERROR_FILE_EXISTS = 80;
public enum FileAttributes : uint
{
FILE_ATTRIBUTE_READONLY = 1,
FILE_ATTRIBUTE_HIDDEN = 2,
FILE_ATTRIBUTE_SYSTEM = 4,
FILE_ATTRIBUTE_DIRECTORY = 16,
FILE_ATTRIBUTE_ARCHIVE = 32,
FILE_ATTRIBUTE_DEVICE = 64,
FILE_ATTRIBUTE_NORMAL = 128,
FILE_ATTRIBUTE_TEMPORARY = 256,
FILE_ATTRIBUTE_SPARSEFILE = 512,
FILE_ATTRIBUTE_REPARSEPOINT = 1024,
FILE_ATTRIBUTE_COMPRESSED = 2048,
FILE_ATTRIBUTE_OFFLINE = 4096,
FILE_ATTRIBUTE_NOT_CONTENT_INDEXED = 8192,
FILE_ATTRIBUTE_ENCRYPTED = 16384,
FILE_FLAG_FIRST_PIPE_INSTANCE = 524288,
FILE_FLAG_OPEN_NO_RECALL = 1048576,
FILE_FLAG_OPEN_REPARSE_POINT = 2097152,
FILE_FLAG_POSIX_SEMANTICS = 16777216,
FILE_FLAG_BACKUP_SEMANTICS = 33554432,
FILE_FLAG_DELETE_ON_CLOSE = 67108864,
FILE_FLAG_SEQUENTIAL_SCAN = 134217728,
FILE_FLAG_RANDOM_ACCESS = 268435456,
FILE_FLAG_NO_BUFFERING = 536870912,
FILE_FLAG_OVERLAPPED = 1073741824,
FILE_FLAG_WRITE_THROUGH = 2147483648
}
public enum FileAccess : uint
{
FILE_READ_DATA = 1,
FILE_LIST_DIRECTORY = 1,
FILE_WRITE_DATA = 2,
FILE_ADD_FILE = 2,
FILE_APPEND_DATA = 4,
FILE_ADD_SUBDIRECTORY = 4,
FILE_CREATE_PIPE_INSTANCE = 4,
FILE_READ_EA = 8,
FILE_WRITE_EA = 16,
FILE_EXECUTE = 32,
FILE_TRAVERSE = 32,
FILE_DELETE_CHILD = 64,
FILE_READ_ATTRIBUTES = 128,
FILE_WRITE_ATTRIBUTES = 256,
SPECIFIC_RIGHTS_ALL = 65535,
DELETE = 65536,
READ_CONTROL = 131072,
STANDARD_RIGHTS_READ = 131072,
STANDARD_RIGHTS_WRITE = 131072,
STANDARD_RIGHTS_EXECUTE = 131072,
WRITE_DAC = 262144,
WRITE_OWNER = 524288,
STANDARD_RIGHTS_REQUIRED = 983040,
SYNCHRONIZE = 1048576,
FILE_GENERIC_READ = 1179785,
FILE_GENERIC_EXECUTE = 1179808,
FILE_GENERIC_WRITE = 1179926,
STANDARD_RIGHTS_ALL = 2031616,
FILE_ALL_ACCESS = 2032127,
ACCESS_SYSTEM_SECURITY = 16777216,
MAXIMUM_ALLOWED = 33554432,
GENERIC_ALL = 268435456,
GENERIC_EXECUTE = 536870912,
GENERIC_WRITE = 1073741824,
GENERIC_READ = 2147483648
}
public static SafeFileHandle OpenFile(
string filePath,
FileMode fileMode,
FileAccess fileAccess,
FileShare fileShare,
FileAttributes fileAttributes)
{
SafeFileHandle output = CreateFile(filePath, fileAccess, fileShare, IntPtr.Zero, fileMode, fileAttributes | FileAttributes.FILE_FLAG_OVERLAPPED, IntPtr.Zero);
if (output.IsInvalid)
{
ThrowWin32Exception(Marshal.GetLastWin32Error());
}
return output;
}
/// <summary>
/// Locks the specified directory so it can't be deleted or renamed by any other process.
/// The trick is to open a handle on the directory (FILE_FLAG_BACKUP_SEMANTICS | FILE_FLAG_OPEN_REPARSE_POINT)
/// and keep it open. FILE_FLAG_OPEN_REPARSE_POINT is required if the path is a junction and is ignored for a standard directory.
/// The caller must call Close() or Dispose() on the returned safe handle to release the lock.
/// </summary>
/// <param name="path">Path to an existing directory or junction</param>
/// <returns>SafeFileHandle</returns>
public static SafeFileHandle LockDirectory(string path)
{
SafeFileHandle result = CreateFile(
path,
FileAccess.GENERIC_READ,
FileShare.Read,
IntPtr.Zero,
FileMode.Open,
FileAttributes.FILE_FLAG_BACKUP_SEMANTICS | FileAttributes.FILE_FLAG_OPEN_REPARSE_POINT,
IntPtr.Zero);
if (result.IsInvalid)
{
ThrowWin32Exception(Marshal.GetLastWin32Error());
}
return result;
}
public static void ThrowWin32Exception(int error, params int[] ignoreErrors)
{
if (ignoreErrors.Any(ignored => ignored == error))
{
return;
}
if (error == ERROR_FILE_EXISTS)
{
throw new Win32FileExistsException();
}
throw new Win32Exception(error);
}
[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
private static extern SafeFileHandle CreateFile(
[In] string lpFileName,
[MarshalAs(UnmanagedType.U4)] FileAccess dwDesiredAccess,
FileShare dwShareMode,
[In] IntPtr lpSecurityAttributes,
[MarshalAs(UnmanagedType.U4)]FileMode dwCreationDisposition,
[MarshalAs(UnmanagedType.U4)]FileAttributes dwFlagsAndAttributes,
[In] IntPtr hTemplateFile);
public class Win32FileExistsException : Win32Exception
{
public Win32FileExistsException()
: base(NativeMethods.ERROR_FILE_EXISTS)
{
}
}
}
}


@ -0,0 +1,10 @@
namespace GVFS.Common.Physical.FileSystem
{
public class DirectoryItemInfo
{
public string Name { get; set; }
public string FullName { get; set; }
public long Length { get; set; }
public bool IsDirectory { get; set; }
}
}


@ -0,0 +1,26 @@
using System;
using System.IO;
namespace GVFS.Common.Physical.FileSystem
{
public class FileProperties
{
public static readonly FileProperties DefaultFile = new FileProperties(FileAttributes.Normal, DateTime.MinValue, DateTime.MinValue, DateTime.MinValue, 0);
public static readonly FileProperties DefaultDirectory = new FileProperties(FileAttributes.Directory, DateTime.MinValue, DateTime.MinValue, DateTime.MinValue, 0);
public FileProperties(FileAttributes attributes, DateTime creationTimeUTC, DateTime lastAccessTimeUTC, DateTime lastWriteTimeUTC, long length)
{
this.FileAttributes = attributes;
this.CreationTimeUTC = creationTimeUTC;
this.LastAccessTimeUTC = lastAccessTimeUTC;
this.LastWriteTimeUTC = lastWriteTimeUTC;
this.Length = length;
}
public FileAttributes FileAttributes { get; private set; }
public DateTime CreationTimeUTC { get; private set; }
public DateTime LastAccessTimeUTC { get; private set; }
public DateTime LastWriteTimeUTC { get; private set; }
public long Length { get; private set; }
}
}


@ -0,0 +1,169 @@
using Microsoft.Win32.SafeHandles;
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.InteropServices;
namespace GVFS.Common.Physical.FileSystem
{
public class PhysicalFileSystem
{
public const int DefaultStreamBufferSize = 8192;
// https://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.internalbuffersize(v=vs.110).aspx:
// Max FileSystemWatcher.InternalBufferSize is 64 KB
private const int WatcherBufferSize = 64 * 1024;
public static void RecursiveDelete(string path)
{
DirectoryInfo directory = new DirectoryInfo(path);
foreach (FileInfo file in directory.GetFiles())
{
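// Reset attributes before deleting; File.Delete throws if the file is read-only.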
file.Attributes = FileAttributes.Normal;
file.Delete();
}
foreach (DirectoryInfo subDirectory in directory.GetDirectories())
{
RecursiveDelete(subDirectory.FullName);
}
directory.Delete();
}
public virtual bool FileExists(string path)
{
return File.Exists(path);
}
public virtual void DeleteFile(string path)
{
File.Delete(path);
}
public virtual string ReadAllText(string path)
{
return File.ReadAllText(path);
}
public virtual IEnumerable<string> ReadLines(string path)
{
return File.ReadLines(path);
}
public virtual void WriteAllText(string path, string contents)
{
File.WriteAllText(path, contents);
}
public virtual Stream OpenFileStream(string path, FileMode fileMode, FileAccess fileAccess, FileShare shareMode)
{
return this.OpenFileStream(path, fileMode, fileAccess, NativeMethods.FileAttributes.FILE_ATTRIBUTE_NORMAL, shareMode);
}
public virtual Stream OpenFileStream(string path, FileMode fileMode, FileAccess fileAccess, NativeMethods.FileAttributes attributes, FileShare shareMode)
{
FileAccess access = fileAccess & FileAccess.ReadWrite;
return new FileStream((SafeFileHandle)this.OpenFile(path, fileMode, fileAccess, (FileAttributes)attributes, shareMode), access, DefaultStreamBufferSize, true);
}
public virtual SafeHandle OpenFile(string path, FileMode fileMode, FileAccess fileAccess, FileAttributes attributes, FileShare shareMode)
{
return NativeMethods.OpenFile(path, fileMode, (NativeMethods.FileAccess)fileAccess, shareMode, (NativeMethods.FileAttributes)attributes);
}
public virtual void DeleteDirectory(string path, bool recursive = false)
{
RecursiveDelete(path);
}
/// <summary>
/// Locks the specified directory so it can't be deleted or renamed by any other process.
/// </summary>
/// <param name="path">Path to an existing directory or junction</param>
public virtual SafeFileHandle LockDirectory(string path)
{
return NativeMethods.LockDirectory(path);
}
public virtual IEnumerable<DirectoryItemInfo> ItemsInDirectory(string path)
{
DirectoryInfo ntfsDirectory = new DirectoryInfo(path);
foreach (FileSystemInfo ntfsItem in ntfsDirectory.GetFileSystemInfos())
{
DirectoryItemInfo itemInfo = new DirectoryItemInfo()
{
FullName = ntfsItem.FullName,
Name = ntfsItem.Name,
IsDirectory = (ntfsItem.Attributes & FileAttributes.Directory) != 0
};
if (!itemInfo.IsDirectory)
{
itemInfo.Length = ((FileInfo)ntfsItem).Length;
}
yield return itemInfo;
}
}
public virtual FileProperties GetFileProperties(string path)
{
FileInfo entry = new FileInfo(path);
if (entry.Exists)
{
return new FileProperties(
entry.Attributes,
entry.CreationTimeUtc,
entry.LastAccessTimeUtc,
entry.LastWriteTimeUtc,
entry.Length);
}
else
{
return FileProperties.DefaultFile;
}
}
public virtual IDisposable MonitorChanges(
string directory,
NotifyFilters notifyFilter,
Action<FileSystemEventArgs> onCreate,
Action<RenamedEventArgs> onRename,
Action<FileSystemEventArgs> onDelete)
{
FileSystemWatcher watcher = new FileSystemWatcher(directory);
watcher.IncludeSubdirectories = true;
watcher.NotifyFilter = notifyFilter;
watcher.InternalBufferSize = WatcherBufferSize;
watcher.EnableRaisingEvents = true;
if (onCreate != null)
{
watcher.Created += (sender, args) => onCreate(args);
}
if (onRename != null)
{
watcher.Renamed += (sender, args) =>
{
// Skip the event if args.Name is null.
// Name can be null if the FileSystemWatcher's buffer has an entry for OLD_NAME that is not followed by an
// entry for NEW_NAME. This scenario results in two rename events being fired, the first with a null Name and the
// second with a null OldName.
if (args.Name != null)
{
onRename(args);
}
};
}
if (onDelete != null)
{
watcher.Deleted += (sender, args) => onDelete(args);
}
return watcher;
}
}
}


@ -0,0 +1,49 @@
using System;
using System.IO;
using System.Threading.Tasks;
namespace GVFS.Common.Physical.FileSystem
{
public static class StreamReaderExtensions
{
private const int ReadWriteTimeoutMs = 10000;
private const int BufferSize = 64 * 1024;
public static void CopyBlockTo<TTimeoutException>(this StreamReader input, StreamWriter writer, long numBytes)
where TTimeoutException : TimeoutException, new()
{
char[] buffer = new char[BufferSize];
int read;
while (numBytes > 0)
{
int bytesToRead = Math.Min(buffer.Length, (int)numBytes);
read = input.ReadBlockAsync(buffer, 0, bytesToRead).Timeout<int, TTimeoutException>(ReadWriteTimeoutMs);
if (read <= 0)
{
break;
}
writer.WriteAsync(buffer, 0, read).Timeout<TTimeoutException>(ReadWriteTimeoutMs);
numBytes -= read;
}
}
public static async Task CopyBlockToAsync(this StreamReader input, StreamWriter writer, long numBytes)
{
char[] buffer = new char[BufferSize];
int read;
while (numBytes > 0)
{
int bytesToRead = Math.Min(buffer.Length, (int)Math.Min(numBytes, int.MaxValue));
read = await input.ReadBlockAsync(buffer, 0, bytesToRead);
if (read <= 0)
{
break;
}
await writer.WriteAsync(buffer, 0, read);
numBytes -= read;
}
}
}
}


@ -0,0 +1,43 @@
using System.IO;
using System.Text;
namespace GVFS.Common.Physical.Git
{
public class BigEndianReader : BinaryReader
{
public BigEndianReader(Stream input)
: base(input, Encoding.Default, leaveOpen: true)
{
}
public override short ReadInt16()
{
return EndianHelper.Swap(base.ReadInt16());
}
public override int ReadInt32()
{
return EndianHelper.Swap(base.ReadInt32());
}
public override long ReadInt64()
{
return EndianHelper.Swap(base.ReadInt64());
}
public override ushort ReadUInt16()
{
return EndianHelper.Swap(base.ReadUInt16());
}
public override uint ReadUInt32()
{
return EndianHelper.Swap(base.ReadUInt32());
}
public override ulong ReadUInt64()
{
return EndianHelper.Swap(base.ReadUInt64());
}
}
}


@ -0,0 +1,8 @@
using System;
namespace GVFS.Common.Physical.Git
{
public class CopyBlobContentTimeoutException : TimeoutException
{
}
}


@ -0,0 +1,47 @@
namespace GVFS.Common.Physical.Git
{
public static class EndianHelper
{
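// Git's index file stores multi-byte integers in network (big-endian) byte order.
// These helpers swap byte order so values round-trip correctly on little-endian machines.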
public static short Swap(short source)
{
return (short)Swap((ushort)source);
}
public static int Swap(int source)
{
return (int)Swap((uint)source);
}
public static long Swap(long source)
{
return (long)Swap((ulong)source);
}
public static ushort Swap(ushort source)
{
return (ushort)(((source & 0x000000FF) << 8) |
((source & 0x0000FF00) >> 8));
}
public static uint Swap(uint source)
{
return ((source & 0x000000FF) << 24)
| ((source & 0x0000FF00) << 8)
| ((source & 0x00FF0000) >> 8)
| ((source & 0xFF000000) >> 24);
}
public static ulong Swap(ulong source)
{
return
((source & 0x00000000000000FF) << 56)
| ((source & 0x000000000000FF00) << 40)
| ((source & 0x0000000000FF0000) << 24)
| ((source & 0x00000000FF000000) << 8)
| ((source & 0x000000FF00000000) >> 8)
| ((source & 0x0000FF0000000000) >> 24)
| ((source & 0x00FF000000000000) >> 40)
| ((source & 0xFF00000000000000) >> 56);
}
}
}


@ -0,0 +1,117 @@
using GVFS.Common.Git;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Runtime.InteropServices;
namespace GVFS.Common.Physical.Git
{
public class GVFSGitObjects : GitObjects
{
private static readonly TimeSpan NegativeCacheTTL = TimeSpan.FromSeconds(30);
private string objectsPath;
private ConcurrentDictionary<string, DateTime> objectNegativeCache;
public GVFSGitObjects(GVFSContext context, HttpGitObjects httpGitObjects)
: base(context.Tracer, context.Enlistment, httpGitObjects)
{
this.Context = context;
this.objectsPath = Path.Combine(context.Enlistment.WorkingDirectoryRoot, GVFSConstants.DotGit.Objects.Root);
this.objectNegativeCache = new ConcurrentDictionary<string, DateTime>(StringComparer.OrdinalIgnoreCase);
}
public virtual string HeadTreeSha
{
get { return this.Context.Repository.GetHeadTreeSha(); }
}
protected GVFSContext Context { get; private set; }
public virtual SafeHandle OpenGitObject(string firstTwoShaDigits, string remainingShaDigits)
{
return
this.OpenLooseObject(this.objectsPath, firstTwoShaDigits, remainingShaDigits)
?? this.DownloadObject(firstTwoShaDigits, remainingShaDigits);
}
public bool TryCopyBlobContentStream(string sha, Action<StreamReader, long> writeAction)
{
if (!this.Context.Repository.TryCopyBlobContentStream(sha, writeAction))
{
if (!this.TryDownloadAndSaveObject(sha.Substring(0, 2), sha.Substring(2)))
{
return false;
}
return this.Context.Repository.TryCopyBlobContentStream(sha, writeAction);
}
return true;
}
public bool TryDownloadAndSaveObject(string firstTwoShaDigits, string remainingShaDigits)
{
DateTime negativeCacheRequestTime;
string objectId = firstTwoShaDigits + remainingShaDigits;
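// Check the negative cache first: an object recently confirmed missing on the server
// is not requested again until NegativeCacheTTL has elapsed.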
if (this.objectNegativeCache.TryGetValue(objectId, out negativeCacheRequestTime))
{
if (negativeCacheRequestTime > DateTime.Now.Subtract(NegativeCacheTTL))
{
return false;
}
this.objectNegativeCache.TryRemove(objectId, out negativeCacheRequestTime);
}
DownloadAndSaveObjectResult result = this.TryDownloadAndSaveObject(objectId);
switch (result)
{
case DownloadAndSaveObjectResult.Success:
return true;
case DownloadAndSaveObjectResult.ObjectNotOnServer:
this.objectNegativeCache.AddOrUpdate(objectId, DateTime.Now, (unused1, unused2) => DateTime.Now);
return false;
case DownloadAndSaveObjectResult.Error:
return false;
default:
throw new InvalidOperationException("Unknown DownloadAndSaveObjectResult value");
}
}
public bool TryGetBlobSizeLocally(string sha, out long length)
{
return this.Context.Repository.TryGetBlobLength(sha, out length);
}
public List<HttpGitObjects.GitObjectSize> GetFileSizes(IEnumerable<string> objectIds)
{
return this.GitObjectRequestor.QueryForFileSizes(objectIds);
}
private SafeHandle OpenLooseObject(string objectsRoot, string firstTwoShaDigits, string remainingShaDigits)
{
string looseObjectPath = Path.Combine(
objectsRoot,
firstTwoShaDigits,
remainingShaDigits);
if (this.Context.FileSystem.FileExists(looseObjectPath))
{
return this.Context.FileSystem.OpenFile(looseObjectPath, FileMode.Open, (FileAccess)NativeMethods.FileAccess.FILE_GENERIC_READ, FileAttributes.Normal, FileShare.Read);
}
return null;
}
private SafeHandle DownloadObject(string firstTwoShaDigits, string remainingShaDigits)
{
this.TryDownloadAndSaveObject(firstTwoShaDigits, remainingShaDigits);
return this.OpenLooseObject(this.objectsPath, firstTwoShaDigits, remainingShaDigits);
}
}
}


@ -0,0 +1,378 @@
using GVFS.Common.Physical.FileSystem;
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
namespace GVFS.Common.Physical.Git
{
public class GitIndex : IDisposable
{
private const ushort ExtendedBit = 0x4000;
private const ushort SkipWorktreeBit = 0x4000;
private const int BaseEntryLength = 62;
private const int MaxPathBufferSize = 4096;
private static readonly DateTime UnixEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
private Dictionary<string, long> pathOffsets;
private bool pathOffsetsIsInvalid;
private string indexPath;
private string lockPath;
private ITracer tracer;
private Enlistment enlistment;
private Stream indexFileStream;
private FileBasedLock gitIndexLock;
public GitIndex(ITracer tracer, Enlistment enlistment, string virtualIndexPath, string virtualIndexLockPath)
{
this.indexPath = virtualIndexPath;
this.lockPath = virtualIndexLockPath;
this.pathOffsetsIsInvalid = true;
this.tracer = tracer;
this.enlistment = enlistment;
}
public void Initialize()
{
this.gitIndexLock = new FileBasedLock(
new PhysicalFileSystem(),
this.tracer,
this.lockPath,
"GVFS",
FileBasedLock.ExistingLockCleanup.DeleteExisting);
}
public CallbackResult Open()
{
if (!File.Exists(this.indexPath))
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("ErrorMessage", "Can't open the index because it doesn't exist");
this.tracer.RelatedError(metadata);
return CallbackResult.FatalError;
}
if (!this.gitIndexLock.TryAcquireLockAndDeleteOnClose())
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
this.tracer.RelatedEvent(EventLevel.Verbose, "OpenCantAcquireIndexLock", metadata);
return CallbackResult.RetryableError;
}
CallbackResult result = CallbackResult.FatalError;
try
{
// TODO 667979: check if the index is missing and generate a new one if needed
this.indexFileStream = new FileStream(this.indexPath, FileMode.Open, FileAccess.ReadWrite, FileShare.Read);
result = CallbackResult.Success;
}
catch (IOException e)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("Exception", e.ToString());
metadata.Add("ErrorMessage", "IOException in Open (RetryableError)");
this.tracer.RelatedError(metadata);
result = CallbackResult.RetryableError;
}
catch (Exception e)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("Exception", e.ToString());
metadata.Add("ErrorMessage", "Exception in Open (FatalError)");
this.tracer.RelatedError(metadata);
result = CallbackResult.FatalError;
}
finally
{
if (result != CallbackResult.Success)
{
if (!this.gitIndexLock.TryReleaseLock())
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("ErrorMessage", "Unable to release index.lock in Open (FatalError)");
this.tracer.RelatedError(metadata);
result = CallbackResult.FatalError;
}
}
}
return result;
}
public CallbackResult Close()
{
if (this.indexFileStream != null)
{
this.indexFileStream.Dispose();
this.indexFileStream = null;
}
try
{
if (!this.gitIndexLock.IsOpen() ||
this.gitIndexLock.TryReleaseLock())
{
return CallbackResult.Success;
}
}
catch (Exception e)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("Exception", e.ToString());
metadata.Add("ErrorMessage", "Fatal Exception in Close");
this.tracer.RelatedError(metadata);
}
return CallbackResult.FatalError;
}
public virtual CallbackResult ClearSkipWorktreeAndUpdateEntry(string filePath, DateTime createTimeUtc, DateTime lastWriteTimeUtc, uint fileSize)
{
try
{
if (this.pathOffsetsIsInvalid)
{
this.pathOffsetsIsInvalid = false;
this.ParseIndex();
if (this.pathOffsetsIsInvalid)
{
return CallbackResult.RetryableError;
}
}
string gitStyleFilePath = filePath.TrimStart(GVFSConstants.PathSeparator).Replace(GVFSConstants.PathSeparator, GVFSConstants.GitPathSeparator);
long offset;
if (this.pathOffsets.TryGetValue(gitStyleFilePath, out offset))
{
if (createTimeUtc == DateTime.MinValue ||
lastWriteTimeUtc == DateTime.MinValue ||
fileSize == 0)
{
try
{
FileInfo fileInfo = new FileInfo(Path.Combine(this.enlistment.WorkingDirectoryRoot, filePath));
if (fileInfo.Exists)
{
createTimeUtc = fileInfo.CreationTimeUtc;
lastWriteTimeUtc = fileInfo.LastWriteTimeUtc;
fileSize = (uint)fileInfo.Length;
}
}
catch (IOException e)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("filePath", filePath);
metadata.Add("Exception", e.ToString());
metadata.Add("ErrorMessage", "IOException caught while trying to get FileInfo for index entry");
this.tracer.RelatedError(metadata);
}
}
uint ctimeSeconds = this.ToUnixEpochSeconds(createTimeUtc);
uint ctimeNanosecondFraction = this.ToUnixNanosecondFraction(createTimeUtc);
uint mtimeSeconds = this.ToUnixEpochSeconds(lastWriteTimeUtc);
uint mtimeNanosecondFraction = this.ToUnixNanosecondFraction(lastWriteTimeUtc);
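// Rewrite the entry's cached stat data (ctime, mtime, size) in place, then zero the
// extended flags, which clears the skip-worktree bit now that the file is on disk.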
this.indexFileStream.Seek(offset, SeekOrigin.Begin);
this.indexFileStream.Write(BitConverter.GetBytes(EndianHelper.Swap(ctimeSeconds)), 0, 4); // ctime seconds
this.indexFileStream.Write(BitConverter.GetBytes(EndianHelper.Swap(ctimeNanosecondFraction)), 0, 4); // ctime nanosecond fractions
this.indexFileStream.Write(BitConverter.GetBytes(EndianHelper.Swap(mtimeSeconds)), 0, 4); // mtime seconds
this.indexFileStream.Write(BitConverter.GetBytes(EndianHelper.Swap(mtimeNanosecondFraction)), 0, 4); // mtime nanosecond fractions
this.indexFileStream.Seek(20, SeekOrigin.Current); // dev + ino + mode + uid + gid
this.indexFileStream.Write(BitConverter.GetBytes(EndianHelper.Swap(fileSize)), 0, 4); // size
this.indexFileStream.Seek(22, SeekOrigin.Current); // sha + flags
this.indexFileStream.Write(new byte[2] { 0, 0 }, 0, 2); // extended flags
this.indexFileStream.Flush();
this.pathOffsets.Remove(gitStyleFilePath);
}
}
catch (IOException e)
{
this.pathOffsetsIsInvalid = true;
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("Exception", e.ToString());
metadata.Add("ErrorMessage", "IOException in ClearSkipWorktreeBitWhileHoldingIndexLock (RetryableError)");
this.tracer.RelatedError(metadata);
return CallbackResult.RetryableError;
}
catch (Exception e)
{
this.pathOffsetsIsInvalid = true;
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", "GitIndex");
metadata.Add("Exception", e.ToString());
metadata.Add("ErrorMessage", "Exception in ClearSkipWorktreeBitWhileHoldingIndexLock (FatalError)");
this.tracer.RelatedError(metadata);
return CallbackResult.FatalError;
}
return CallbackResult.Success;
}
public void Invalidate()
{
if (!this.gitIndexLock.IsOpen())
{
this.pathOffsetsIsInvalid = true;
}
}
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
if (disposing)
{
if (this.gitIndexLock != null)
{
this.gitIndexLock.Dispose();
this.gitIndexLock = null;
}
}
}
private uint ToUnixNanosecondFraction(DateTime datetime)
{
if (datetime > UnixEpoch)
{
TimeSpan timediff = datetime - UnixEpoch;
double nanoseconds = (timediff.TotalSeconds - Math.Truncate(timediff.TotalSeconds)) * 1000000000;
return Convert.ToUInt32(nanoseconds);
}
else
{
return 0;
}
}
private uint ToUnixEpochSeconds(DateTime datetime)
{
if (datetime > UnixEpoch)
{
return Convert.ToUInt32(Math.Truncate((datetime - UnixEpoch).TotalSeconds));
}
else
{
return 0;
}
}
private void ParseIndex()
{
this.pathOffsets = new Dictionary<string, long>(StringComparer.OrdinalIgnoreCase);
this.indexFileStream.Position = 0;
using (BigEndianReader reader = new BigEndianReader(this.indexFileStream))
{
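// Skip the 4-byte signature ("DIRC"), then read the 4-byte version number and the 4-byte entry count.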
reader.ReadBytes(4);
uint version = reader.ReadUInt32();
uint entryCount = reader.ReadUInt32();
int previousPathLength = 0;
byte[] pathBuffer = new byte[MaxPathBufferSize];
for (int i = 0; i < entryCount; i++)
{
// If pathOffsets gets marked invalid we can bail out,
// since the index will have to be reparsed
if (this.pathOffsetsIsInvalid)
{
return;
}
long entryOffset = this.indexFileStream.Position;
int entryLength = BaseEntryLength;
reader.ReadBytes(60);
ushort flags = reader.ReadUInt16();
bool isExtended = (flags & ExtendedBit) == ExtendedBit;
int pathLength = (ushort)((flags << 20) >> 20);
entryLength += pathLength;
bool skipWorktree = false;
if (isExtended)
{
ushort extendedFlags = reader.ReadUInt16();
skipWorktree = (extendedFlags & SkipWorktreeBit) == SkipWorktreeBit;
entryLength += 2;
}
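// Index format version 4 prefix-compresses paths: each entry stores a variable-length
// count of characters to drop from the end of the previous path, followed by the
// NUL-terminated suffix to append.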
if (version == 4)
{
int replaceLength = this.ReadReplaceLength(reader);
byte ch;
int index = previousPathLength - replaceLength;
while ((ch = reader.ReadByte()) != '\0')
{
if (index >= pathBuffer.Length)
{
throw new InvalidOperationException("Git index path entry too large.");
}
pathBuffer[index] = ch;
++index;
}
previousPathLength = index;
if (skipWorktree)
{
this.pathOffsets[Encoding.UTF8.GetString(pathBuffer, 0, index)] = entryOffset;
}
}
else
{
byte[] path = reader.ReadBytes(pathLength);
int nullbytes = 8 - (entryLength % 8);
reader.ReadBytes(nullbytes);
if (skipWorktree)
{
this.pathOffsets[Encoding.UTF8.GetString(path)] = entryOffset;
}
}
}
}
}
private int ReadReplaceLength(BinaryReader reader)
{
int headerByte = reader.ReadByte();
int offset = headerByte & 0x7f;
// Terminate the loop when the high bit is no longer set.
for (int i = 0; (headerByte & 0x80) != 0; i++)
{
headerByte = reader.ReadByte();
offset += 1;
offset = (offset << 7) + (headerByte & 0x7f);
}
return offset;
}
}
}


@ -0,0 +1,143 @@
using GVFS.Common.Git;
using GVFS.Common.Physical.FileSystem;
using GVFS.Common.Tracing;
using System;
using System.Collections.Generic;
using System.IO;
namespace GVFS.Common.Physical.Git
{
public class GitRepo : IDisposable
{
private string workingDirectoryPath;
private ITracer tracer;
private PhysicalFileSystem fileSystem;
private GitIndex index;
private ProcessPool<GitCatFileBatchProcess> catFileProcessPool;
private ProcessPool<GitCatFileBatchCheckProcess> batchCheckProcessPool;
public GitRepo(ITracer tracer, Enlistment enlistment, PhysicalFileSystem fileSystem, GitIndex index)
{
this.tracer = tracer;
this.workingDirectoryPath = enlistment.WorkingDirectoryRoot;
this.fileSystem = fileSystem;
this.index = index;
this.GVFSLock = new GVFSLock(tracer);
this.batchCheckProcessPool = new ProcessPool<GitCatFileBatchCheckProcess>(
tracer,
() => new GitCatFileBatchCheckProcess(enlistment),
Environment.ProcessorCount);
this.catFileProcessPool = new ProcessPool<GitCatFileBatchProcess>(
tracer,
() => new GitCatFileBatchProcess(enlistment),
Environment.ProcessorCount);
}
public GitIndex Index
{
get { return this.index; }
}
public GVFSLock GVFSLock
{
get;
private set;
}
public void Initialize()
{
this.Index.Initialize();
}
public virtual string GetHeadTreeSha()
{
return this.catFileProcessPool.Invoke(
catFile => catFile.GetTreeSha(GVFSConstants.HeadCommitName));
}
public virtual string GetHeadCommitId()
{
return this.catFileProcessPool.Invoke(
catFile => catFile.GetCommitId(GVFSConstants.HeadCommitName));
}
public virtual bool TryCopyBlobContentStream(string blobSha, Action<StreamReader, long> writeAction)
{
return this.catFileProcessPool.Invoke(
catFile => catFile.TryCopyBlobContentStream(blobSha, writeAction));
}
public virtual bool TryGetBlobLength(string blobSha, out long size)
{
long? output = this.batchCheckProcessPool.Invoke<long?>(
catFileBatch =>
{
long value;
if (catFileBatch.TryGetObjectSize(blobSha, out value))
{
return value;
}
return null;
});
if (output.HasValue)
{
size = output.Value;
return true;
}
size = 0;
return false;
}
public virtual bool TryGetFileSha(string commitId, string virtualPath, out string sha)
{
sha = this.catFileProcessPool.Invoke(
catFile =>
{
string innerSha;
if (catFile.TryGetFileSha(commitId, virtualPath, out innerSha))
{
return innerSha;
}
return null;
});
return !string.IsNullOrWhiteSpace(sha);
}
public virtual IEnumerable<GitTreeEntry> GetTreeEntries(string commitId, string path)
{
return this.catFileProcessPool.Invoke(catFile => catFile.GetTreeEntries(commitId, path));
}
public virtual IEnumerable<GitTreeEntry> GetTreeEntries(string sha)
{
return this.catFileProcessPool.Invoke(catFile => catFile.GetTreeEntries(sha));
}
public void Dispose()
{
if (this.catFileProcessPool != null)
{
this.catFileProcessPool.Dispose();
}
if (this.batchCheckProcessPool != null)
{
this.batchCheckProcessPool.Dispose();
}
if (this.index != null)
{
this.index.Dispose();
}
}
}
}


@ -0,0 +1,33 @@
using Microsoft.Win32;
namespace GVFS.Common.Physical
{
public class RegistryUtils
{
public static string GetStringFromRegistry(RegistryHive registryHive, string key, string valueName)
{
string value = GetStringFromRegistry(registryHive, key, valueName, RegistryView.Registry64);
if (value == null)
{
value = GetStringFromRegistry(registryHive, key, valueName, RegistryView.Registry32);
}
return value;
}
private static string GetStringFromRegistry(RegistryHive registryHive, string key, string valueName, RegistryView view)
{
RegistryKey localKey = RegistryKey.OpenBaseKey(registryHive, view);
RegistryKey localKeySub = localKey.OpenSubKey(key);
object value = localKeySub == null ? null : localKeySub.GetValue(valueName);
if (value == null)
{
return null;
}
return (string)value;
}
}
}


@ -0,0 +1,139 @@
using GVFS.Common.Tracing;
using Microsoft.Isam.Esent.Collections.Generic;
using System;
using System.IO;
namespace GVFS.Common.Physical
{
public class RepoMetadata : IDisposable
{
private PersistentDictionary<string, string> repoMetadata;
public RepoMetadata(string dotGVFSPath)
{
this.repoMetadata = new PersistentDictionary<string, string>(
Path.Combine(dotGVFSPath, GVFSConstants.DatabaseNames.RepoMetadata));
}
public static int GetCurrentDiskLayoutVersion()
{
return DiskLayoutVersion.CurrentDiskLayoutVerion;
}
public static bool CheckDiskLayoutVersion(string dotGVFSPath, out string error)
{
if (!Directory.Exists(Path.Combine(dotGVFSPath, GVFSConstants.DatabaseNames.RepoMetadata)))
{
error = DiskLayoutVersion.MissingVersionError;
return false;
}
try
{
using (RepoMetadata repoMetadata = new RepoMetadata(dotGVFSPath))
{
return repoMetadata.CheckDiskLayoutVersion(out error);
}
}
catch (Exception e)
{
error = "Failed to check disk layout version of enlistment, Exception: " + e.ToString();
return false;
}
}
public void SaveCurrentDiskLayoutVersion()
{
DiskLayoutVersion.SaveCurrentDiskLayoutVersion(this.repoMetadata);
}
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
protected void Dispose(bool disposing)
{
if (this.repoMetadata != null)
{
this.repoMetadata.Dispose();
this.repoMetadata = null;
}
}
private bool CheckDiskLayoutVersion(out string error)
{
return DiskLayoutVersion.CheckDiskLayoutVersion(this.repoMetadata, out error);
}
private static class DiskLayoutVersion
{
// The current disk layout version. This number should be bumped whenever a disk format change is made
// that would impact an older GVFS's ability to mount the repo
public const int CurrentDiskLayoutVerion = 3;
public const string MissingVersionError = "Enlistment disk layout version not found, check if a breaking change has been made to GVFS since cloning this enlistment.";
private const string DiskLayoutVersionKey = "DiskLayoutVersion";
// MaxDiskLayoutVersion ensures that older versions of GVFS will not try to mount newer enlistments (if the
// disk layout of the newer GVFS is incompatible).
// GVFS will only mount if the disk layout version of the repo is <= MaxDiskLayoutVersion
private const int MaxDiskLayoutVersion = CurrentDiskLayoutVerion;
// MinDiskLayoutVersion ensures that GVFS will not attempt to mount an older repo if there has been a breaking format
// change since that enlistment was cloned.
// - GVFS will only mount if the disk layout version of the repo is >= MinDiskLayoutVersion
// - Bump this version number only when a breaking change is being made (i.e. upgrade is not supported)
private const int MinDiskLayoutVersion = 3;
public static void SaveCurrentDiskLayoutVersion(PersistentDictionary<string, string> repoMetadata)
{
repoMetadata[DiskLayoutVersionKey] = CurrentDiskLayoutVerion.ToString();
repoMetadata.Flush();
}
public static bool CheckDiskLayoutVersion(PersistentDictionary<string, string> repoMetadata, out string error)
{
error = string.Empty;
string value;
if (repoMetadata.TryGetValue(DiskLayoutVersionKey, out value))
{
int persistedVersionNumber;
if (!int.TryParse(value, out persistedVersionNumber))
{
error = "Failed to parse persisted disk layout version number";
return false;
}
if (persistedVersionNumber < MinDiskLayoutVersion)
{
error = string.Format(
"Breaking change to GVFS disk layout has been made since cloning. \r\nEnlistment disk layout version: {0} \r\nGVFS disk layout version: {1} \r\nMinimum supported version: {2}",
persistedVersionNumber,
CurrentDiskLayoutVersion,
MinDiskLayoutVersion);
return false;
}
else if (persistedVersionNumber > MaxDiskLayoutVersion)
{
error = string.Format(
"Changes to GVFS disk layout do not allow mounting after downgrade. Try mounting again using a more recent version of GVFS. \r\nEnlistment disk layout version: {0} \r\nGVFS disk layout version: {1}",
persistedVersionNumber,
CurrentDiskLayoutVersion);
return false;
}
}
else
{
error = MissingVersionError;
return false;
}
return true;
}
}
}
}
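A minimal usage sketch for RepoMetadata, assuming an illustrative dotGVFSPath value:

string dotGVFSPath = @"C:\src\MyRepo\.gvfs";   // illustrative path
string error;
if (RepoMetadata.CheckDiskLayoutVersion(dotGVFSPath, out error))
{
    // Safe to mount; after creating or upgrading the enlistment, persist the current version.
    using (RepoMetadata metadata = new RepoMetadata(dotGVFSPath))
    {
        metadata.SaveCurrentDiskLayoutVersion();
    }
}
else
{
    Console.WriteLine("Cannot mount: " + error);
}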

Просмотреть файл

@ -0,0 +1,126 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
namespace GVFS.Common
{
/// <summary>
/// Deserializer for packs and indexes for prefetch packs.
/// </summary>
public class PrefetchPacksDeserializer
{
private const int NumPackHeaderBytes = 3 * sizeof(long);
private static readonly byte[] PrefetchPackExpectedHeader
= new byte[]
{
(byte)'G', (byte)'P', (byte)'R', (byte)'E', (byte)' ', // Magic
1 // Version
};
private readonly Stream source;
public PrefetchPacksDeserializer(
Stream source)
{
this.source = source;
}
/// <summary>
/// Read all the packs and indexes from the source stream and return a <see cref="PackAndIndex"/> for each pack
/// and index. Caller must consume pack stream fully before the index stream.
/// </summary>
public IEnumerable<PackAndIndex> EnumeratePacks()
{
this.ValidateHeader();
// Start reading objects
byte[] buffer = new byte[NumPackHeaderBytes];
int packCount = this.ReadPackCount(buffer);
for (int i = 0; i < packCount; i++)
{
long timestamp;
long packLength;
long indexLength;
this.ReadPackHeader(buffer, out timestamp, out packLength, out indexLength);
using (Stream packData = new RestrictedStream(this.source, 0, packLength, leaveOpen: true))
using (Stream indexData = indexLength > 0 ? new RestrictedStream(this.source, 0, indexLength, leaveOpen: true) : null)
{
yield return new PackAndIndex(packData, indexData, timestamp);
}
}
}
/// <summary>
/// Read the ushort pack count
/// </summary>
private ushort ReadPackCount(byte[] buffer)
{
StreamUtil.TryReadGreedy(this.source, buffer, 0, 2);
return BitConverter.ToUInt16(buffer, 0);
}
/// <summary>
/// Parse the current pack header
/// </summary>
private void ReadPackHeader(
byte[] buffer,
out long timestamp,
out long packLength,
out long indexLength)
{
int totalBytes = StreamUtil.TryReadGreedy(
this.source,
buffer,
0,
NumPackHeaderBytes);
if (totalBytes == NumPackHeaderBytes)
{
timestamp = BitConverter.ToInt64(buffer, 0);
packLength = BitConverter.ToInt64(buffer, 8);
indexLength = BitConverter.ToInt64(buffer, 16);
}
else
{
throw new RetryableException(
string.Format(
"Reached end of stream before expected {0} bytes. Got {1}. Buffer: {2}",
NumPackHeaderBytes,
totalBytes,
SHA1Util.HexStringFromBytes(buffer)));
}
}
private void ValidateHeader()
{
byte[] headerBuf = new byte[PrefetchPackExpectedHeader.Length];
StreamUtil.TryReadGreedy(this.source, headerBuf, 0, headerBuf.Length);
if (!headerBuf.SequenceEqual(PrefetchPackExpectedHeader))
{
throw new InvalidDataException("Unexpected header: " + Encoding.UTF8.GetString(headerBuf));
}
}
public class PackAndIndex
{
public PackAndIndex(Stream packStream, Stream idxStream, long timestamp)
{
this.PackStream = packStream;
this.IndexStream = idxStream;
this.Timestamp = timestamp;
this.UniqueId = Guid.NewGuid().ToString("N");
}
public Stream PackStream { get; }
public Stream IndexStream { get; }
public long Timestamp { get; }
public string UniqueId { get; }
}
}
}
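A sketch of the intended consumption pattern, assuming a hypothetical responseStream and hypothetical WritePackToDisk/WriteIndexToDisk helpers; the key constraint (drain the pack stream before touching the index stream) comes from the summary above:

var deserializer = new PrefetchPacksDeserializer(responseStream);
foreach (PrefetchPacksDeserializer.PackAndIndex pack in deserializer.EnumeratePacks())
{
    // Both streams are windows over the same source stream, so the pack must be
    // consumed fully before the index is read.
    WritePackToDisk(pack.PackStream, pack.Timestamp, pack.UniqueId);
    if (pack.IndexStream != null)
    {
        WriteIndexToDisk(pack.IndexStream, pack.Timestamp, pack.UniqueId);
    }
}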

Просмотреть файл

@ -0,0 +1,291 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Management;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Security.Principal;
namespace GVFS.Common
{
public static class ProcessHelper
{
/// <summary>
/// Get the process Id for the highest process with the given name in the current process hierarchy.
/// </summary>
/// <param name="parentName">The name of the parent process to consider (e.g. git.exe)</param>
/// <returns>The process Id or -1 if not found.</returns>
public static int GetParentProcessId(string parentName)
{
Dictionary<int, Process> processesSnapshot = Process.GetProcesses().ToDictionary(p => p.Id);
int highestParentId = GVFSConstants.InvalidProcessId;
Process currentProcess = Process.GetCurrentProcess();
while (true)
{
ProcessBasicInformation processBasicInfo;
int size;
int result =
NtQueryInformationProcess(
currentProcess.Handle,
0, // Denotes ProcessBasicInformation
out processBasicInfo,
Marshal.SizeOf(typeof(ProcessBasicInformation)),
out size);
int potentialParentId = processBasicInfo.InheritedFromUniqueProcessId.ToInt32();
if (result != 0 || potentialParentId == 0)
{
return GetProcessIdIfHasName(highestParentId, parentName);
}
Process processFound;
if (processesSnapshot.TryGetValue(potentialParentId, out processFound))
{
if (processFound.MainModule.ModuleName.Equals(parentName, StringComparison.OrdinalIgnoreCase))
{
highestParentId = potentialParentId;
}
else if (highestParentId > 0)
{
return GetProcessIdIfHasName(highestParentId, parentName);
}
}
else
{
if (highestParentId > 0)
{
return GetProcessIdIfHasName(highestParentId, parentName);
}
return GVFSConstants.InvalidProcessId;
}
currentProcess = Process.GetProcessById(potentialParentId);
}
}
public static bool TryGetProcess(int processId, out Process process)
{
try
{
process = Process.GetProcessById(processId);
return true;
}
catch (ArgumentException)
{
process = null;
return false;
}
}
public static ProcessResult Run(string programName, string args, bool redirectOutput = true)
{
ProcessStartInfo processInfo = new ProcessStartInfo(programName);
processInfo.UseShellExecute = false;
processInfo.RedirectStandardInput = true;
processInfo.RedirectStandardOutput = redirectOutput;
processInfo.RedirectStandardError = redirectOutput;
processInfo.WindowStyle = ProcessWindowStyle.Hidden;
processInfo.Arguments = args;
return Run(processInfo);
}
public static void StartBackgroundProcess(string programName, string args, bool createWindow)
{
ProcessStartInfo processInfo = new ProcessStartInfo(programName, args);
if (createWindow)
{
processInfo.WindowStyle = ProcessWindowStyle.Minimized;
}
else
{
processInfo.WindowStyle = ProcessWindowStyle.Hidden;
}
Process executingProcess = new Process();
executingProcess.StartInfo = processInfo;
executingProcess.Start();
}
public static string GetCurrentProcessLocation()
{
Assembly assembly = Assembly.GetExecutingAssembly();
return Path.GetDirectoryName(assembly.Location);
}
public static string GetEntryClassName()
{
Assembly assembly = Assembly.GetEntryAssembly();
if (assembly == null)
{
// The PR build tests don't produce an entry assembly because they are run from unmanaged code,
// so we fall back to using this assembly. This should never happen for a normal exe invocation.
assembly = Assembly.GetExecutingAssembly();
}
return assembly.GetName().Name;
}
public static string GetCurrentProcessVersion()
{
Assembly assembly = Assembly.GetExecutingAssembly();
FileVersionInfo fileVersionInfo = FileVersionInfo.GetVersionInfo(assembly.Location);
return fileVersionInfo.ProductVersion;
}
public static bool IsAdminElevated()
{
using (WindowsIdentity id = WindowsIdentity.GetCurrent())
{
return new WindowsPrincipal(id).IsInRole(WindowsBuiltInRole.Administrator);
}
}
public static string WhereDirectory(string processName)
{
ProcessResult result = ProcessHelper.Run("where", processName);
if (result.ExitCode != 0)
{
return null;
}
string firstPath =
string.IsNullOrWhiteSpace(result.Output)
? null
: result.Output.Split(new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries).FirstOrDefault();
if (firstPath == null)
{
return null;
}
try
{
return Path.GetDirectoryName(firstPath);
}
catch (IOException)
{
return null;
}
}
public static ProcessResult Run(ProcessStartInfo processInfo, string errorMsgDelimeter = "\r\n", object executionLock = null)
{
using (Process executingProcess = new Process())
{
string output = string.Empty;
string errors = string.Empty;
// From https://msdn.microsoft.com/en-us/library/system.diagnostics.process.standardoutput.aspx
// To avoid deadlocks, use asynchronous read operations on at least one of the streams.
// Do not perform a synchronous read to the end of both redirected streams.
executingProcess.StartInfo = processInfo;
executingProcess.ErrorDataReceived += (sender, args) =>
{
if (args.Data != null)
{
errors = errors + args.Data + errorMsgDelimeter;
}
};
if (executionLock != null)
{
lock (executionLock)
{
output = StartProcess(executingProcess);
}
}
else
{
output = StartProcess(executingProcess);
}
return new ProcessResult(output.ToString(), errors.ToString(), executingProcess.ExitCode);
}
}
public static string GetCommandLine(Process process)
{
using (ManagementObjectSearcher wmiSearch =
new ManagementObjectSearcher("SELECT CommandLine FROM Win32_Process WHERE ProcessId = " + process.Id))
{
foreach (ManagementBaseObject commandLineObject in wmiSearch.Get())
{
return process.StartInfo.FileName + " " + commandLineObject["CommandLine"];
}
}
return string.Empty;
}
private static int GetProcessIdIfHasName(int processId, string expectedName)
{
if (ProcessIdHasName(processId, expectedName))
{
return processId;
}
else
{
return GVFSConstants.InvalidProcessId;
}
}
private static bool ProcessIdHasName(int processId, string expectedName)
{
Process process;
if (TryGetProcess(processId, out process))
{
return process.MainModule.ModuleName.Equals(expectedName, StringComparison.OrdinalIgnoreCase);
}
return false;
}
private static string StartProcess(Process executingProcess)
{
executingProcess.Start();
if (executingProcess.StartInfo.RedirectStandardError)
{
executingProcess.BeginErrorReadLine();
}
string output = string.Empty;
if (executingProcess.StartInfo.RedirectStandardOutput)
{
output = executingProcess.StandardOutput.ReadToEnd();
}
executingProcess.WaitForExit();
return output;
}
[DllImport("ntdll.dll")]
private static extern int NtQueryInformationProcess(
IntPtr processHandle,
int processInformationClass,
out ProcessBasicInformation processInformation,
int processInformationLength,
out int returnLength);
[DllImport("kernel32.dll")]
private static extern IntPtr GetConsoleWindow();
[StructLayout(LayoutKind.Sequential)]
private struct ProcessBasicInformation
{
public IntPtr ExitStatus;
public IntPtr PebBaseAddress;
public IntPtr AffinityMask;
public IntPtr BasePriority;
public UIntPtr UniqueProcessId;
public IntPtr InheritedFromUniqueProcessId;
}
}
}
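A small sketch of ProcessHelper.Run; the program name and arguments are illustrative:

ProcessResult result = ProcessHelper.Run("git.exe", "status --porcelain");
if (result.ExitCode == 0)
{
    Console.WriteLine(result.Output);
}
else
{
    Console.WriteLine("git failed: " + result.Errors);
}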

Просмотреть файл

@ -0,0 +1,158 @@
using GVFS.Common.Git;
using GVFS.Common.Tracing;
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Threading;
namespace GVFS.Common
{
public class ProcessPool<TProcess> : IDisposable where TProcess : GitCatFileProcess
{
private const int TryAddTimeoutMilliseconds = 10;
private const int TryTakeTimeoutMilliseconds = 10;
private const int IdleSecondsBeforeCleanup = 10;
// To help idle processes get cleaned up close to when they pass the IdleSecondsBeforeCleanup threshold,
// we set the cleanup timer to fire at half the IdleSecondsBeforeCleanup interval
private const int CleanupTimerPeriodMilliseconds = IdleSecondsBeforeCleanup * 1000 / 2;
private readonly Func<TProcess> createProcess;
private readonly BlockingCollection<RunningProcess> pool;
private readonly ITracer tracer;
private readonly Timer cleanupTimer;
public ProcessPool(ITracer tracer, Func<TProcess> createProcess, int size)
{
Debug.Assert(size > 0, "ProcessPool: size must be greater than 0");
this.tracer = tracer;
this.createProcess = createProcess;
this.pool = new BlockingCollection<RunningProcess>(size);
this.cleanupTimer = new Timer(x => this.CleanUpPool(shutdownAllProcesses: false), null, 0, CleanupTimerPeriodMilliseconds);
}
public void Dispose()
{
this.cleanupTimer.Change(Timeout.Infinite, Timeout.Infinite);
this.cleanupTimer.Dispose();
this.pool.CompleteAdding();
this.CleanUpPool(shutdownAllProcesses: true);
}
public void Invoke(Action<TProcess> function)
{
this.Invoke(process => { function(process); return false; });
}
public TResult Invoke<TResult>(Func<TProcess, TResult> function)
{
TProcess process = null;
bool returnToPool = true;
try
{
process = this.GetRunningProcessFromPool();
TResult result = function(process);
// Retry once if the process crashed while we were running it
if (!process.IsRunning())
{
process = this.GetRunningProcessFromPool();
result = function(process);
}
return result;
}
catch
{
returnToPool = false;
throw;
}
finally
{
if (returnToPool)
{
this.ReturnToPool(process);
}
else if (process != null)
{
process.Kill();
}
}
}
private TProcess GetRunningProcessFromPool()
{
RunningProcess poolProcess;
if (this.pool.TryTake(out poolProcess, TryTakeTimeoutMilliseconds))
{
return poolProcess.Process;
}
else
{
return this.createProcess();
}
}
private void ReturnToPool(TProcess process)
{
if (process != null && process.IsRunning())
{
if (this.pool.IsAddingCompleted ||
!this.pool.TryAdd(new RunningProcess(process), TryAddTimeoutMilliseconds))
{
// No more adding to the pool or trying to add to the pool failed
process.Kill();
}
}
}
private void CleanUpPool(bool shutdownAllProcesses)
{
int numberInPool = this.pool.Count;
for (int i = 0; i < numberInPool; i++)
{
RunningProcess poolProcess;
if (this.pool.TryTake(out poolProcess))
{
if (shutdownAllProcesses || this.pool.IsAddingCompleted)
{
poolProcess.Dispose();
}
else if (poolProcess.Process.IsRunning() &&
poolProcess.LastUsed.AddSeconds(IdleSecondsBeforeCleanup) > DateTime.Now)
{
this.pool.TryAdd(poolProcess, TryAddTimeoutMilliseconds);
}
else
{
// Process is either not running or has been idle too long
poolProcess.Dispose();
}
}
}
}
private class RunningProcess : IDisposable
{
public RunningProcess(TProcess process)
{
this.Process = process;
this.LastUsed = DateTime.Now;
}
public TProcess Process { get; private set; }
public DateTime LastUsed { get; }
public void Dispose()
{
if (this.Process != null)
{
this.Process.Dispose();
this.Process = null;
}
}
}
}
}
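A hedged sketch of pooling cat-file processes; the tracer and the createCatFileProcess factory are assumed to be created elsewhere, and the work inside the callback is elided:

using (var pool = new ProcessPool<GitCatFileProcess>(tracer, createCatFileProcess, size: 4))
{
    pool.Invoke(catFile =>
    {
        // Use the pooled process here. It is returned to the pool when the
        // callback completes, or killed if the callback throws.
    });
}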

Просмотреть файл

@ -0,0 +1,16 @@
namespace GVFS.Common
{
public class ProcessResult
{
public ProcessResult(string output, string errors, int exitCode)
{
this.Output = output;
this.Errors = errors;
this.ExitCode = exitCode;
}
public string Output { get; }
public string Errors { get; }
public int ExitCode { get; }
}
}

Просмотреть файл

@ -0,0 +1,22 @@
using System.Reflection;
using System.Runtime.InteropServices;
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("GVFS.Common")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("GVFS.Common")]
[assembly: AssemblyCopyright("Copyright © Microsoft 2016")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]
// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("9ea6ff63-6bb0-4440-9bfb-0ae79a8f9ba9")]

Просмотреть файл

@ -0,0 +1,367 @@
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using Microsoft.Isam.Esent.Collections.Generic;
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace GVFS.Common
{
public class ReliableBackgroundOperations<TBackgroundOperation> : IDisposable where TBackgroundOperation : IBackgroundOperation
{
private const int ActionRetryDelayMS = 50;
private const int MaxCallbackAttemptsOnShutdown = 5;
private const int LogUpdateTaskThreshold = 25000;
private static readonly string EtwArea = "ProcessBackgroundOperations";
private readonly ReaderWriterLockSlim acquisitionLock;
private PersistentDictionary<Guid, TBackgroundOperation> persistence;
private ConcurrentQueue<TBackgroundOperation> backgroundOperations;
private AutoResetEvent wakeUpThread;
private Task backgroundThread;
private bool isStopping;
private GVFSContext context;
private Func<CallbackResult> preCallback;
private Func<TBackgroundOperation, CallbackResult> callback;
private Func<CallbackResult> postCallback;
public ReliableBackgroundOperations(
GVFSContext context,
Func<CallbackResult> preCallback,
Func<TBackgroundOperation, CallbackResult> callback,
Func<CallbackResult> postCallback,
string databaseName)
{
this.acquisitionLock = new ReaderWriterLockSlim();
this.persistence = new PersistentDictionary<Guid, TBackgroundOperation>(
Path.Combine(context.Enlistment.DotGVFSRoot, databaseName));
this.backgroundOperations = new ConcurrentQueue<TBackgroundOperation>();
this.wakeUpThread = new AutoResetEvent(true);
this.context = context;
this.preCallback = preCallback;
this.callback = callback;
this.postCallback = postCallback;
}
private enum AcquireGitLockResult
{
LockAcquired,
ShuttingDown
}
public int Count
{
get { return this.backgroundOperations.Count; }
}
public void Start()
{
this.EnqueueSavedOperations();
this.backgroundThread = Task.Run((Action)this.ProcessBackgroundOperations);
if (this.backgroundOperations.Count > 0)
{
this.wakeUpThread.Set();
}
}
public void Enqueue(TBackgroundOperation backgroundOperation)
{
this.persistence[backgroundOperation.Id] = backgroundOperation;
this.persistence.Flush();
if (!this.isStopping)
{
this.backgroundOperations.Enqueue(backgroundOperation);
this.wakeUpThread.Set();
}
}
public void Shutdown()
{
this.isStopping = true;
this.wakeUpThread.Set();
this.backgroundThread.Wait();
}
public void ObtainAcquisitionLock()
{
this.acquisitionLock.EnterReadLock();
}
public void ReleaseAcquisitionLock()
{
if (this.acquisitionLock.IsReadLockHeld)
{
this.acquisitionLock.ExitReadLock();
}
}
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
protected void Dispose(bool disposing)
{
if (this.persistence != null)
{
this.persistence.Dispose();
this.persistence = null;
}
if (this.backgroundThread != null)
{
this.backgroundThread.Dispose();
this.backgroundThread = null;
}
}
private void EnqueueSavedOperations()
{
foreach (Guid operationId in this.persistence.Keys)
{
// We are setting the Id here because there may be old operations that
// were persisted without the Id being set in the background operation object
TBackgroundOperation backgroundOperation = this.persistence[operationId];
backgroundOperation.Id = operationId;
this.backgroundOperations.Enqueue(backgroundOperation);
}
}
private AcquireGitLockResult WaitToAcquireGitLock()
{
while (!this.context.Repository.GVFSLock.TryAcquireLock())
{
if (this.isStopping)
{
return AcquireGitLockResult.ShuttingDown;
}
Thread.Sleep(ActionRetryDelayMS);
}
return AcquireGitLockResult.LockAcquired;
}
private void ReleaseGitLockIfNecessary()
{
try
{
// Only release GVFS lock if the queue is empty. If it's not empty then another thread
// added something to the queue, allow it to continue processing.
while (this.backgroundOperations.IsEmpty)
{
// An external caller (e.g. a GVFlt callback) will hold reader status while adding something to the queue.
// If unable to enter writer status, wait and try again if the queue is still empty.
if (this.acquisitionLock.TryEnterWriteLock(millisecondsTimeout: 10))
{
this.context.Repository.GVFSLock.ReleaseLock();
break;
}
Thread.Sleep(millisecondsTimeout: 10);
}
}
catch (Exception e)
{
this.LogErrorAndExit("gitLock.TryReleaseLock threw Exception, shutting down", e);
}
finally
{
if (this.acquisitionLock.IsWriteLockHeld)
{
this.acquisitionLock.ExitWriteLock();
}
}
}
private void ProcessBackgroundOperations()
{
TBackgroundOperation backgroundOperation;
while (true)
{
AcquireGitLockResult acquireLockResult = AcquireGitLockResult.ShuttingDown;
try
{
this.wakeUpThread.WaitOne();
if (this.isStopping)
{
return;
}
acquireLockResult = this.WaitToAcquireGitLock();
switch (acquireLockResult)
{
case AcquireGitLockResult.LockAcquired:
break;
case AcquireGitLockResult.ShuttingDown:
return;
default:
this.LogErrorAndExit("Invalid AcquireGitLockResult result");
return;
}
this.RunCallbackUntilSuccess(this.preCallback, "PreCallback");
int tasksProcessed = 0;
while (this.backgroundOperations.TryPeek(out backgroundOperation))
{
if (tasksProcessed % LogUpdateTaskThreshold == 0 &&
(tasksProcessed >= LogUpdateTaskThreshold || this.backgroundOperations.Count >= LogUpdateTaskThreshold))
{
this.LogTaskProcessingStatus(tasksProcessed);
}
if (this.isStopping)
{
// If we are stopping, then GVFlt has already been shut down
// Some of the queued background tasks may require GVFlt, and so it is unsafe to
// proceed. GVFS will resume any queued tasks next time it is mounted
this.persistence.Flush();
return;
}
CallbackResult callbackResult = this.callback(backgroundOperation);
switch (callbackResult)
{
case CallbackResult.Success:
this.backgroundOperations.TryDequeue(out backgroundOperation);
this.persistence.Remove(backgroundOperation.Id);
++tasksProcessed;
break;
case CallbackResult.RetryableError:
if (!this.isStopping)
{
Thread.Sleep(ActionRetryDelayMS);
}
break;
case CallbackResult.FatalError:
this.LogErrorAndExit("Callback encountered fatal error, exiting process");
break;
default:
this.LogErrorAndExit("Invalid background operation result");
break;
}
}
this.persistence.Flush();
if (tasksProcessed >= LogUpdateTaskThreshold)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("BackgroundOperations", EtwArea);
metadata.Add("TasksProcessed", tasksProcessed);
metadata.Add("Message", "Processing background tasks complete");
this.context.Tracer.RelatedEvent(EventLevel.Informational, "TaskProcessingStatus", metadata);
}
if (this.isStopping)
{
return;
}
}
catch (Exception e)
{
this.LogErrorAndExit("ProcessBackgroundOperations caught unhandled exception, exiting process", e);
}
finally
{
if (acquireLockResult == AcquireGitLockResult.LockAcquired)
{
this.RunCallbackUntilSuccess(this.postCallback, "PostCallback");
this.ReleaseGitLockIfNecessary();
}
}
}
}
private void RunCallbackUntilSuccess(Func<CallbackResult> callback, string errorHeader)
{
while (true)
{
CallbackResult callbackResult = callback();
switch (callbackResult)
{
case CallbackResult.Success:
return;
case CallbackResult.RetryableError:
if (this.isStopping)
{
return;
}
Thread.Sleep(ActionRetryDelayMS);
break;
case CallbackResult.FatalError:
this.LogErrorAndExit(errorHeader + " encountered fatal error, exiting process");
return;
default:
this.LogErrorAndExit(errorHeader + " result could not be found");
return;
}
}
}
private void LogWarning(string message)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", EtwArea);
metadata.Add("Message", message);
this.context.Tracer.RelatedEvent(EventLevel.Warning, "Warning", metadata);
}
private void LogError(string message, Exception e = null)
{
this.LogError(message, e, exit: false);
}
private void LogErrorAndExit(string message, Exception e = null)
{
this.LogError(message, e, exit: true);
}
private void LogError(string message, Exception e, bool exit)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Area", EtwArea);
if (e != null)
{
metadata.Add("Exception", e.ToString());
}
metadata.Add("ErrorMessage", message);
this.context.Tracer.RelatedError(metadata);
if (exit)
{
Environment.Exit(1);
}
}
private void LogTaskProcessingStatus(int tasksProcessed)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("BackgroundOperations", EtwArea);
metadata.Add("TasksProcessed", tasksProcessed);
metadata.Add("TasksRemaining", this.backgroundOperations.Count);
this.context.Tracer.RelatedEvent(EventLevel.Informational, "TaskProcessingStatus", metadata);
}
}
}
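A sketch of the expected wiring, with heavy assumptions: context is a GVFSContext built elsewhere, FileOperation is a hypothetical IBackgroundOperation implementation, the database name is illustrative, and the callbacks simply report success:

var queue = new ReliableBackgroundOperations<FileOperation>(
    context,
    preCallback: () => CallbackResult.Success,
    callback: operation => CallbackResult.Success,
    postCallback: () => CallbackResult.Success,
    databaseName: "BackgroundOperations");   // hypothetical database name

queue.Start();
queue.Enqueue(new FileOperation());
// ... later, at unmount time ...
queue.Shutdown();
queue.Dispose();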

Просмотреть файл

@ -0,0 +1,257 @@
using GVFS.Common.Tracing;
using Microsoft.Diagnostics.Tracing;
using System;
using System.IO;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using System.Web;
namespace GVFS.Common
{
public class RetryWrapper<T>
{
private const float MaxBackoffInSeconds = 300; // 5 minutes
private const float DefaultExponentialBackoffBase = 2;
private readonly int maxRetries;
private readonly float exponentialBackoffBase;
private Random rng = new Random();
public RetryWrapper(int maxRetries, float exponentialBackoffBase = DefaultExponentialBackoffBase)
{
this.maxRetries = maxRetries;
this.exponentialBackoffBase = exponentialBackoffBase;
}
public event Action<ErrorEventArgs> OnFailure = delegate { };
public static Action<ErrorEventArgs> StandardErrorHandler(ITracer tracer, string actionName)
{
return eArgs =>
{
EventMetadata metadata = new EventMetadata();
metadata.Add("AttemptNumber", eArgs.TryCount);
metadata.Add("Operation", actionName);
metadata.Add("WillRetry", eArgs.WillRetry);
metadata.Add("ErrorMessage", eArgs.Error != null ? eArgs.Error.Message : null);
tracer.RelatedError(metadata, Keywords.Network);
// Emit with stack at a higher verbosity.
metadata["ErrorMessage"] = eArgs.Error != null ? eArgs.Error.ToString() : null;
tracer.RelatedEvent(EventLevel.Verbose, JsonEtwTracer.NetworkErrorEventName, metadata, Keywords.Network);
};
}
public async Task<InvocationResult> InvokeAsync(Func<int, Task<CallbackResult>> toInvoke)
{
// Use 1-based counting. This makes reporting look a lot nicer and saves a lot of +1s
for (int tryCount = 1; tryCount <= this.maxRetries; ++tryCount)
{
try
{
CallbackResult result = await toInvoke(tryCount);
if (result.HasErrors)
{
if (!this.ShouldRetry(tryCount, null, result))
{
return new InvocationResult(tryCount, result.Error, result.Result);
}
}
else
{
return new InvocationResult(tryCount, true, result.Result);
}
}
catch (Exception e)
{
Exception exceptionToReport =
e is AggregateException
? ((AggregateException)e).Flatten().InnerException
: e;
if (!this.IsHandlableException(exceptionToReport))
{
throw;
}
if (!this.ShouldRetry(tryCount, exceptionToReport, null))
{
return new InvocationResult(tryCount, exceptionToReport);
}
}
// Don't wait for the first retry, since it might just be transient.
// Don't wait after the last try. tryCount is 1-based, so last attempt is tryCount == maxRetries
if (tryCount > 1 && tryCount < this.maxRetries)
{
// Exponential backoff
double backOffSeconds = Math.Min(Math.Pow(this.exponentialBackoffBase, tryCount), MaxBackoffInSeconds);
// Timeouts usually happen when the server is overloaded. If we give all machines the same backoff they will all make
// another request at approximately the same time, causing the problem to happen again and again. To avoid that we
// add random jitter to the backoff. To avoid scaling it too high or too low, the jitter is +/- 10% of the computed backoff
backOffSeconds *= .9 + (this.rng.NextDouble() * .2);
await Task.Delay(TimeSpan.FromSeconds(backOffSeconds));
}
}
// This shouldn't be hit because ShouldRetry will cause a more useful message first.
return new InvocationResult(this.maxRetries, new Exception("Unexpected failure after retrying"));
}
public InvocationResult Invoke(Func<int, CallbackResult> toInvoke)
{
// Use 1-based counting. This makes reporting look a lot nicer and saves a lot of +1s
for (int tryCount = 1; tryCount <= this.maxRetries; ++tryCount)
{
try
{
CallbackResult result = toInvoke(tryCount);
if (result.HasErrors)
{
if (!this.ShouldRetry(tryCount, null, result))
{
return new InvocationResult(tryCount, result.Error, result.Result);
}
}
else
{
return new InvocationResult(tryCount, true, result.Result);
}
}
catch (Exception e)
{
Exception exceptionToReport =
e is AggregateException
? ((AggregateException)e).Flatten().InnerException
: e;
if (!this.IsHandlableException(exceptionToReport))
{
throw;
}
if (!this.ShouldRetry(tryCount, exceptionToReport, null))
{
return new InvocationResult(tryCount, exceptionToReport);
}
}
// Don't wait for the first retry, since it might just be transient.
// Don't wait after the last try. tryCount is 1-based, so last attempt is tryCount == maxRetries
if (tryCount > 1 && tryCount < this.maxRetries)
{
// Exponential backoff
double backOffSeconds = Math.Min(Math.Pow(this.exponentialBackoffBase, tryCount), MaxBackoffInSeconds);
// Timeouts usually happen when the server is overloaded. If we give all machines the same backoff they will all make
// another request at approximately the same time, causing the problem to happen again and again. To avoid that we
// add random jitter to the backoff. To avoid scaling it too high or too low, the jitter is +/- 10% of the computed backoff
backOffSeconds *= .9 + (this.rng.NextDouble() * .2);
Thread.Sleep(TimeSpan.FromSeconds(backOffSeconds));
}
}
// This shouldn't be hit because ShouldRetry will cause a more useful message first.
return new InvocationResult(this.maxRetries, new Exception("Unexpected failure after retrying"));
}
private bool IsHandlableException(Exception e)
{
return
e is HttpException ||
e is HttpRequestException ||
e is IOException ||
e is RetryableException;
}
private bool ShouldRetry(int tryCount, Exception e, CallbackResult result)
{
bool willRetry = tryCount < this.maxRetries &&
(result == null || result.ShouldRetry);
if (e != null)
{
this.OnFailure(new ErrorEventArgs(e, tryCount, willRetry));
}
else
{
this.OnFailure(new ErrorEventArgs(result.Error, tryCount, willRetry));
}
return willRetry;
}
public class ErrorEventArgs
{
public ErrorEventArgs(Exception error, int tryCount, bool willRetry)
{
this.Error = error;
this.TryCount = tryCount;
this.WillRetry = willRetry;
}
public bool WillRetry { get; }
public int TryCount { get; }
public Exception Error { get; }
}
public class InvocationResult
{
public InvocationResult(int tryCount, bool succeeded, T result)
{
this.Attempts = tryCount;
this.Succeeded = succeeded;
this.Result = result;
}
public InvocationResult(int tryCount, Exception error)
{
this.Attempts = tryCount;
this.Succeeded = false;
this.Error = error;
}
public InvocationResult(int tryCount, Exception error, T result)
: this(tryCount, error)
{
this.Result = result;
}
public T Result { get; }
public int Attempts { get; }
public bool Succeeded { get; }
public Exception Error { get; }
}
public class CallbackResult
{
public CallbackResult(T result)
{
this.Result = result;
}
public CallbackResult(Exception error, bool shouldRetry)
{
this.HasErrors = true;
this.Error = error;
this.ShouldRetry = shouldRetry;
}
public CallbackResult(Exception error, bool shouldRetry, T result)
: this(error, shouldRetry)
{
this.Result = result;
}
public bool HasErrors { get; }
public Exception Error { get; }
public bool ShouldRetry { get; }
public T Result { get; }
}
}
}
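A sketch of the synchronous retry path; the tracer and the DownloadLooseObject callback are assumed to exist elsewhere:

var retrier = new RetryWrapper<string>(maxRetries: 5);
retrier.OnFailure += RetryWrapper<string>.StandardErrorHandler(tracer, "DownloadLooseObject");

RetryWrapper<string>.InvocationResult result = retrier.Invoke(
    tryCount => new RetryWrapper<string>.CallbackResult(DownloadLooseObject()));

if (result.Succeeded)
{
    Console.WriteLine("Succeeded after {0} attempt(s)", result.Attempts);
}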

Просмотреть файл

@ -0,0 +1,11 @@
using System;
namespace GVFS.Common
{
public class RetryableException : Exception
{
public RetryableException(string message) : base(message)
{
}
}
}

Просмотреть файл

@ -0,0 +1,9 @@
namespace GVFS.Common
{
public enum ReturnCode
{
Success = 0,
RebootRequired = 2,
GenericError = 3
}
}

Просмотреть файл

@ -0,0 +1,55 @@
using System;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
namespace GVFS.Common
{
public static class SHA1Util
{
public static string SHA1HashStringForUTF8String(string s)
{
return HexStringFromBytes(SHA1ForUTF8String(s));
}
public static byte[] SHA1ForUTF8String(string s)
{
byte[] bytes = Encoding.UTF8.GetBytes(s);
using (SHA1 sha1 = SHA1.Create())
{
return sha1.ComputeHash(bytes);
}
}
/// <summary>
/// Returns a hex string representation of the first
/// <paramref name="numBytes"/> bytes of the buffer (or the whole buffer when numBytes is -1).
/// </summary>
public static string HexStringFromBytes(byte[] buf, int numBytes = -1)
{
unsafe
{
numBytes = numBytes == -1 ? buf.Length : numBytes;
fixed (byte* unsafeBuf = buf)
{
int charIndex = 0;
byte* currentByte = unsafeBuf;
char[] chars = new char[numBytes * 2];
for (int i = 0; i < numBytes; i++)
{
char first = (char)(((*currentByte >> 4) & 0x0F) + 0x30);
char second = (char)((*currentByte & 0x0F) + 0x30);
chars[charIndex++] = first >= 0x3A ? (char)(first + 0x27) : first;
chars[charIndex++] = second >= 0x3A ? (char)(second + 0x27) : second;
currentByte++;
}
return new string(chars);
}
}
}
}
}
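A short sketch of the helpers:

string sha = SHA1Util.SHA1HashStringForUTF8String("hello world");   // 40 lowercase hex chars
string firstFourBytes = SHA1Util.HexStringFromBytes(SHA1Util.SHA1ForUTF8String("hello world"), numBytes: 4);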

Просмотреть файл

@ -0,0 +1,62 @@
using System.IO;
namespace GVFS.Common
{
public class StreamUtil
{
/// <summary>
/// .NET default buffer size <see cref="Stream.CopyTo"/> uses as of 8/30/16
/// </summary>
public const int DefaultCopyBufferSize = 81920;
/// <summary>
/// Copies all bytes from the source stream to the destination stream. This is an exact copy
/// of Stream.CopyTo(), but it can use the supplied buffer instead of allocating a new one.
/// </summary>
/// <remarks>
/// As of .NET 4.6, each call to Stream.CopyTo() allocates a new 80K byte[] buffer, which
/// consumes many more resources than reusing one we already have if the scenario allows it.
/// </remarks>
/// <param name="source">Source stream to copy from</param>
/// <param name="destination">Destination stream to copy to</param>
/// <param name="buffer">
/// Shared buffer to use. If null, we allocate one with the same size .NET would otherwise use.
/// </param>
public static void CopyToWithBuffer(Stream source, Stream destination, byte[] buffer = null)
{
buffer = buffer ?? new byte[DefaultCopyBufferSize];
int read;
while ((read = source.Read(buffer, 0, buffer.Length)) != 0)
{
destination.Write(buffer, 0, read);
}
}
/// <summary>
/// Call <see cref="Stream.Read"/> until either <paramref name="count"/> bytes are read or
/// the end of <paramref name="stream"/> is reached.
/// </summary>
/// <param name="buf">Buffer to read bytes into.</param>
/// <param name="offset">Offset in <paramref name="buf"/> to start reading into.</param>
/// <param name="count">Maximum number of bytes to read.</param>
/// <returns>
/// Number of bytes read, may be less than <paramref name="count"/> if end was reached.
/// </returns>
public static int TryReadGreedy(Stream stream, byte[] buf, int offset, int count)
{
int totalRead = 0;
while (totalRead < count)
{
int read = stream.Read(buf, offset + totalRead, count - totalRead);
if (read == 0)
{
break;
}
totalRead += read;
}
return totalRead;
}
}
}
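A sketch of TryReadGreedy for fixed-size headers; the input stream is assumed to come from elsewhere:

byte[] header = new byte[8];
int read = StreamUtil.TryReadGreedy(stream, header, 0, header.Length);
if (read != header.Length)
{
    throw new InvalidDataException("Stream ended before the full header was read");
}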

Просмотреть файл

@ -0,0 +1,52 @@
using System;
using System.Threading.Tasks;
namespace GVFS.Common
{
public static class TaskExtensions
{
public static void Timeout<TTimeoutException>(this Task self, int timeoutMs)
where TTimeoutException : TimeoutException, new()
{
if (!self.Wait(timeoutMs))
{
throw new TTimeoutException();
}
}
public static T Timeout<T, TTimeoutException>(this Task<T> self, int timeoutMs)
where TTimeoutException : TimeoutException, new()
{
if (!self.Wait(timeoutMs))
{
throw new TTimeoutException();
}
return self.Result;
}
public static async Task TimeoutAsync<TTimeoutException>(this Task self, int timeoutMs)
where TTimeoutException : TimeoutException, new()
{
Task timeout = Task.Delay(timeoutMs);
Task completedFirst = await Task.WhenAny(timeout, self);
if (timeout == completedFirst)
{
throw new TTimeoutException();
}
}
public static async Task<T> TimeoutAsync<T, TTimeoutException>(this Task<T> self, int timeoutMs)
where TTimeoutException : TimeoutException, new()
{
Task timeout = Task.Delay(timeoutMs);
Task completedFirst = await Task.WhenAny(timeout, self);
if (timeout == completedFirst)
{
throw new TTimeoutException();
}
return await self;
}
}
}
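A sketch of the timeout extensions; GitCatFileTimeoutException is hypothetical (it must derive from TimeoutException and have a parameterless constructor), and ReadObjectAsync stands in for any Task-returning call:

Task<string> readTask = ReadObjectAsync(sha);
string contents = readTask.Timeout<string, GitCatFileTimeoutException>(timeoutMs: 5000);

// Or, from inside an async method, without blocking the calling thread:
string contentsAsync = await readTask.TimeoutAsync<string, GitCatFileTimeoutException>(timeoutMs: 5000);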

Просмотреть файл

@ -0,0 +1,18 @@
using System;
using Microsoft.Diagnostics.Tracing;
namespace GVFS.Common.Tracing
{
public class ConsoleEventListener : InProcEventListener
{
public ConsoleEventListener(EventLevel maxVerbosity, Keywords keywordFilter)
: base(maxVerbosity, keywordFilter)
{
}
public override void RecordMessage(string message)
{
Console.WriteLine(message);
}
}
}

Просмотреть файл

@ -0,0 +1,10 @@
using System.Collections.Generic;
namespace GVFS.Common.Tracing
{
// This is a convenience class to make code around event metadata look nicer.
// It's more obvious to see EventMetadata than Dictionary<string, object> everywhere.
public class EventMetadata : Dictionary<string, object>
{
}
}

Просмотреть файл

@ -0,0 +1,26 @@
using System;
using Microsoft.Diagnostics.Tracing;
namespace GVFS.Common.Tracing
{
public interface ITracer : IDisposable
{
ITracer StartActivity(string activityName, EventLevel level);
ITracer StartActivity(string activityName, EventLevel level, EventMetadata metadata);
void RelatedEvent(EventLevel level, string eventName, EventMetadata metadata);
void RelatedEvent(EventLevel level, string eventName, EventMetadata metadata, Keywords keywords);
void RelatedError(EventMetadata metadata);
void RelatedError(EventMetadata metadata, Keywords keywords);
void RelatedError(string message);
void RelatedError(string format, params object[] args);
void Stop(EventMetadata metadata);
}
}

Просмотреть файл

@ -0,0 +1,55 @@
using Microsoft.Diagnostics.Tracing;
using System;
using System.Text;
namespace GVFS.Common.Tracing
{
public abstract class InProcEventListener : EventListener
{
private EventLevel maxVerbosity;
private EventKeywords keywordFilter;
public InProcEventListener(EventLevel maxVerbosity, Keywords keywordFilter)
{
this.maxVerbosity = maxVerbosity;
this.keywordFilter = (EventKeywords)keywordFilter;
}
public abstract void RecordMessage(string message);
protected override void OnEventWritten(EventWrittenEventArgs eventData)
{
if (!this.IsEnabled(eventData.Level, eventData.Keywords))
{
return;
}
StringBuilder eventLine = new StringBuilder();
eventLine.AppendFormat("[{0}] {1}", DateTime.Now, eventData.EventName);
if (eventData.Opcode != 0)
{
eventLine.AppendFormat(" ({0})", eventData.Opcode);
}
if (eventData.Payload != null)
{
eventLine.Append(":");
for (int i = 0; i < eventData.PayloadNames.Count; i++)
{
// Space prefix avoids a string.Join.
eventLine.AppendFormat(" {0}: {1}", eventData.PayloadNames[i], eventData.Payload[i]);
}
}
this.RecordMessage(eventLine.ToString());
}
protected bool IsEnabled(EventLevel level, EventKeywords keyword)
{
return this.keywordFilter != (EventKeywords)Keywords.None &&
this.maxVerbosity >= level &&
this.keywordFilter.HasFlag(keyword);
}
}
}

Просмотреть файл

@ -0,0 +1,266 @@
using Microsoft.Diagnostics.Tracing;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Diagnostics;
namespace GVFS.Common.Tracing
{
public class JsonEtwTracer : ITracer
{
public const string NetworkErrorEventName = "NetworkError";
private string activityName;
private Guid parentActivityId;
private Guid activityId;
private bool stopped = false;
private Stopwatch duration = Stopwatch.StartNew();
private EventLevel startStopLevel;
private List<InProcEventListener> listeners;
public JsonEtwTracer(string providerName, string activityName)
: this(
new EventSource(providerName, EventSourceSettings.EtwSelfDescribingEventFormat),
Guid.Empty,
activityName,
EventLevel.Informational)
{
this.listeners = new List<InProcEventListener>();
}
private JsonEtwTracer(
EventSource eventSource,
Guid parentActivityId,
string activityName,
EventLevel startStopLevel)
{
this.EvtSource = eventSource;
this.parentActivityId = parentActivityId;
this.activityName = activityName;
this.startStopLevel = startStopLevel;
this.activityId = Guid.NewGuid();
}
public EventSource EvtSource { get; }
public static string GetNameFromEnlistmentPath(string enlistmentRootPath)
{
return "Microsoft-GVFS_" + enlistmentRootPath.ToUpper().Replace(':', '_');
}
public void AddConsoleEventListener(EventLevel maxVerbosity, Keywords keywordFilter)
{
this.AddEventListener(
new ConsoleEventListener(maxVerbosity, keywordFilter),
maxVerbosity);
}
public void AddLogFileEventListener(string logFilePath, EventLevel maxVerbosity, Keywords keywordFilter)
{
this.AddEventListener(
new LogFileEventListener(logFilePath, maxVerbosity, keywordFilter),
maxVerbosity);
}
public void Dispose()
{
this.Stop(null);
// If we have no parent, then we are the root tracer and should dispose our eventsource.
if (this.parentActivityId == Guid.Empty)
{
if (this.listeners != null)
{
foreach (InProcEventListener listener in this.listeners)
{
listener.Dispose();
}
this.listeners = null;
}
this.EvtSource.Dispose();
}
}
public virtual void RelatedEvent(EventLevel level, string eventName, EventMetadata metadata)
{
this.RelatedEvent(level, eventName, metadata, Keywords.None);
}
public virtual void RelatedEvent(EventLevel level, string eventName, EventMetadata metadata, Keywords keyword)
{
EventSourceOptions options = this.CreateDefaultOptions(level, keyword);
this.WriteEvent(eventName, metadata, ref options);
}
public virtual void RelatedError(EventMetadata metadata)
{
this.RelatedError(metadata, Keywords.None);
}
public virtual void RelatedError(EventMetadata metadata, Keywords keywords)
{
this.RelatedEvent(EventLevel.Error, GetCategorizedErrorEventName(keywords), metadata, keywords);
}
public virtual void RelatedError(string message)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("ErrorMessage", message);
this.RelatedError(metadata);
}
public virtual void RelatedError(string format, params object[] args)
{
this.RelatedError(string.Format(format, args));
}
public void Stop(EventMetadata metadata)
{
if (this.stopped)
{
return;
}
this.duration.Stop();
this.stopped = true;
EventSourceOptions options = this.CreateDefaultOptions(this.startStopLevel, Keywords.None);
options.Opcode = EventOpcode.Stop;
metadata = metadata ?? new EventMetadata();
metadata.Add("DurationMs", this.duration.ElapsedMilliseconds);
this.WriteEvent(this.activityName, metadata, ref options);
}
public ITracer StartActivity(string childActivityName, EventLevel startStopLevel)
{
return this.StartActivity(childActivityName, startStopLevel, null);
}
public ITracer StartActivity(string childActivityName, EventLevel startStopLevel, EventMetadata startMetadata)
{
JsonEtwTracer subTracer = new JsonEtwTracer(this.EvtSource, this.activityId, childActivityName, startStopLevel);
subTracer.WriteStartEvent(startMetadata);
return subTracer;
}
public void WriteStartEvent(
string enlistmentRoot,
string repoUrl,
string cacheServerUrl,
EventMetadata additionalMetadata = null)
{
EventMetadata metadata = new EventMetadata();
metadata.Add("Version", ProcessHelper.GetCurrentProcessVersion());
if (enlistmentRoot != null)
{
metadata.Add("EnlistmentRoot", enlistmentRoot);
}
if (repoUrl != null)
{
metadata.Add("Remote", Uri.EscapeUriString(repoUrl));
}
if (cacheServerUrl != null)
{
// Changing this key to CacheServerUrl will mess with our telemetry, so it stays for historical reasons
metadata.Add("ObjectsEndpoint", Uri.EscapeUriString(cacheServerUrl));
}
if (additionalMetadata != null)
{
foreach (string key in additionalMetadata.Keys)
{
metadata.Add(key, additionalMetadata[key]);
}
}
this.WriteStartEvent(metadata);
}
public void WriteStartEvent(EventMetadata metadata)
{
EventSourceOptions options = this.CreateDefaultOptions(this.startStopLevel, Keywords.None);
options.Opcode = EventOpcode.Start;
this.WriteEvent(this.activityName, metadata, ref options);
}
private static string GetCategorizedErrorEventName(Keywords keywords)
{
switch (keywords)
{
case Keywords.Network: return NetworkErrorEventName;
default: return "Error";
}
}
private void WriteEvent(string eventName, EventMetadata metadata, ref EventSourceOptions options)
{
if (metadata != null)
{
JsonPayload payload = new JsonPayload(metadata);
EventSource.SetCurrentThreadActivityId(this.activityId);
this.EvtSource.Write(eventName, ref options, ref this.activityId, ref this.parentActivityId, ref payload);
}
else
{
EmptyStruct payload = new EmptyStruct();
EventSource.SetCurrentThreadActivityId(this.activityId);
this.EvtSource.Write(eventName, ref options, ref this.activityId, ref this.parentActivityId, ref payload);
}
}
private EventSourceOptions CreateDefaultOptions(EventLevel level, Keywords keywords)
{
EventSourceOptions options = new EventSourceOptions();
options.Keywords = (EventKeywords)keywords;
options.Level = level;
return options;
}
private void AddEventListener(InProcEventListener listener, EventLevel maxVerbosity)
{
if (this.listeners == null)
{
throw new InvalidOperationException("You can only register a listener on the root tracer object");
}
if (maxVerbosity >= EventLevel.Verbose)
{
listener.RecordMessage(string.Format("ETW Provider name: {0} ({1})", this.EvtSource.Name, this.EvtSource.Guid));
listener.RecordMessage("Activity Id: " + this.activityId);
}
listener.EnableEvents(this.EvtSource, maxVerbosity);
this.listeners.Add(listener);
}
// Needed to pass relatedId without metadata
[EventData]
private struct EmptyStruct
{
}
[EventData]
private struct JsonPayload
{
public JsonPayload(object serializableObject)
{
this.Json = JsonConvert.SerializeObject(serializableObject);
}
[EventField]
public string Json { get; }
}
}
}
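A sketch of creating a root tracer, attaching a log-file listener, and timing a child activity; the provider name, log path, and start-event values are illustrative:

using (JsonEtwTracer tracer = new JsonEtwTracer("Microsoft-GVFS-Sample", "Mount"))
{
    tracer.AddLogFileEventListener(@"C:\temp\mount.log", EventLevel.Informational, Keywords.Any);
    tracer.WriteStartEvent(enlistmentRoot: null, repoUrl: null, cacheServerUrl: null);

    using (ITracer activity = tracer.StartActivity("CheckoutFiles", EventLevel.Informational))
    {
        // ... work ...
        activity.Stop(new EventMetadata { { "FileCount", 42 } });
    }
}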

Просмотреть файл

@ -0,0 +1,15 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace GVFS.Common.Tracing
{
public enum Keywords : long
{
None = 1 << 0,
Network = 1 << 1,
Any = ~0,
}
}

Просмотреть файл

@ -0,0 +1,41 @@
using Microsoft.Diagnostics.Tracing;
using System.IO;
namespace GVFS.Common.Tracing
{
public class LogFileEventListener : InProcEventListener
{
private FileStream logFile;
private TextWriter writer;
public LogFileEventListener(string logFilePath, EventLevel maxVerbosity, Keywords keywordFilter)
: base(maxVerbosity, keywordFilter)
{
this.logFile = File.Open(logFilePath, FileMode.CreateNew, FileAccess.ReadWrite, FileShare.Read);
this.writer = StreamWriter.Synchronized(new StreamWriter(this.logFile));
}
public override void RecordMessage(string message)
{
this.writer.WriteLine(message);
this.writer.Flush();
}
public override void Dispose()
{
if (this.writer != null)
{
this.writer.Dispose();
this.writer = null;
}
if (this.logFile != null)
{
this.logFile.Dispose();
this.logFile = null;
}
base.Dispose();
}
}
}

Просмотреть файл

@ -0,0 +1,145 @@
using Microsoft.Win32.SafeHandles;
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
namespace GVFS.Common
{
public class WindowsProcessJob : IDisposable
{
private SafeJobHandle jobHandle;
private bool disposed;
public WindowsProcessJob(Process process)
{
IntPtr newHandle = Native.CreateJobObject(IntPtr.Zero, null);
if (newHandle == IntPtr.Zero)
{
throw new InvalidOperationException("Unable to create a job. Error: " + Marshal.GetLastWin32Error());
}
this.jobHandle = new SafeJobHandle(newHandle);
Native.JOBOBJECT_BASIC_LIMIT_INFORMATION info = new Native.JOBOBJECT_BASIC_LIMIT_INFORMATION
{
LimitFlags = 0x2000 // JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE
};
Native.JOBOBJECT_EXTENDED_LIMIT_INFORMATION extendedInfo = new Native.JOBOBJECT_EXTENDED_LIMIT_INFORMATION
{
BasicLimitInformation = info
};
int length = Marshal.SizeOf(typeof(Native.JOBOBJECT_EXTENDED_LIMIT_INFORMATION));
if (!Native.SetInformationJobObject(this.jobHandle, Native.JobObjectInfoType.ExtendedLimitInformation, ref extendedInfo, (uint)length))
{
throw new InvalidOperationException("Unable to configure the job. Error: " + Marshal.GetLastWin32Error());
}
if (!Native.AssignProcessToJobObject(this.jobHandle, process.Handle))
{
throw new InvalidOperationException("Unable to add process to the job. Error: " + Marshal.GetLastWin32Error());
}
}
public void Dispose()
{
this.Dispose(true);
GC.SuppressFinalize(this);
}
private void Dispose(bool disposing)
{
if (!this.disposed)
{
this.jobHandle.Dispose();
this.jobHandle = null;
this.disposed = true;
}
}
private static class Native
{
public enum JobObjectInfoType
{
AssociateCompletionPortInformation = 7,
BasicLimitInformation = 2,
BasicUIRestrictions = 4,
EndOfJobTimeInformation = 6,
ExtendedLimitInformation = 9,
SecurityLimitInformation = 5,
GroupInformation = 11
}
[DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
public static extern IntPtr CreateJobObject(IntPtr attributes, string name);
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool SetInformationJobObject(SafeJobHandle jobHandle, JobObjectInfoType infoType, [In] ref JOBOBJECT_EXTENDED_LIMIT_INFORMATION jobObjectInfo, uint jobObjectInfoLength);
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool AssignProcessToJobObject(SafeJobHandle jobHandle, IntPtr processHandle);
[DllImport("kernel32.dll", SetLastError = true)]
public static extern bool CloseHandle(IntPtr handle);
[StructLayout(LayoutKind.Sequential)]
public struct IO_COUNTERS
{
public ulong ReadOperationCount;
public ulong WriteOperationCount;
public ulong OtherOperationCount;
public ulong ReadTransferCount;
public ulong WriteTransferCount;
public ulong OtherTransferCount;
}
[StructLayout(LayoutKind.Sequential)]
public struct JOBOBJECT_BASIC_LIMIT_INFORMATION
{
public long PerProcessUserTimeLimit;
public long PerJobUserTimeLimit;
public uint LimitFlags;
public UIntPtr MinimumWorkingSetSize;
public UIntPtr MaximumWorkingSetSize;
public uint ActiveProcessLimit;
public UIntPtr Affinity;
public uint PriorityClass;
public uint SchedulingClass;
}
[StructLayout(LayoutKind.Sequential)]
public struct SECURITY_ATTRIBUTES
{
public uint Length;
public IntPtr SecurityDescriptor;
public int InheritHandle;
}
[StructLayout(LayoutKind.Sequential)]
public struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION
{
public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation;
public IO_COUNTERS IoInfo;
public UIntPtr ProcessMemoryLimit;
public UIntPtr JobMemoryLimit;
public UIntPtr PeakProcessMemoryUsed;
public UIntPtr PeakJobMemoryUsed;
}
}
private sealed class SafeJobHandle : SafeHandleZeroOrMinusOneIsInvalid
{
public SafeJobHandle(IntPtr handle) : base(true)
{
this.SetHandle(handle);
}
protected override bool ReleaseHandle()
{
return Native.CloseHandle(this.handle);
}
}
}
}
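A sketch of tying a child process's lifetime to the current process; the command is illustrative:

Process child = Process.Start("git.exe", "fetch --all");
using (WindowsProcessJob job = new WindowsProcessJob(child))
{
    // If this process exits (or the job is disposed) before the child finishes,
    // the kill-on-close limit tears the child down as well.
    child.WaitForExit();
}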

Просмотреть файл

@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<packages>
<package id="ManagedEsent" version="1.9.4" targetFramework="net452" />
<package id="Microsoft.Database.Collections.Generic" version="1.9.4" targetFramework="net452" />
<package id="Microsoft.Database.Isam" version="1.9.4" targetFramework="net452" />
<package id="Microsoft.Diagnostics.Tracing.EventRegister" version="1.1.28" targetFramework="net452" />
<package id="Microsoft.Diagnostics.Tracing.EventSource" version="1.1.28" targetFramework="net452" />
<package id="Microsoft.Diagnostics.Tracing.EventSource.Redist" version="1.1.28" targetFramework="net452" />
<package id="Newtonsoft.Json" version="7.0.1" targetFramework="net452" />
<package id="StyleCop.Error.MSBuild" version="1.0.0" targetFramework="net452" />
<package id="StyleCop.MSBuild" version="4.7.54.0" targetFramework="net452" developmentDependency="true" />
</packages>

Просмотреть файл

@ -0,0 +1,7 @@
namespace GVFS.FunctionalTests.Category
{
public static class CategoryConstants
{
public const string FastFetch = "FastFetch";
}
}

Просмотреть файл

@ -0,0 +1,224 @@
using GVFS.FunctionalTests.Properties;
using GVFS.Tests.Should;
using NUnit.Framework;
using System;
using System.IO;
namespace GVFS.FunctionalTests.FileSystemRunners
{
public class BashRunner : ShellRunner
{
private static string[] fileNotFoundMessages = new string[]
{
"cannot stat",
"cannot remove",
"No such file or directory"
};
private static string[] invalidMovePathMessages = new string[]
{
"cannot move",
"No such file or directory"
};
private static string[] moveDirectoryNotSupportedMessage = new string[]
{
"Function not implemented"
};
private static string[] permissionDeniedMessage = new string[]
{
"Permission denied"
};
private readonly string pathToBash;
public BashRunner()
{
if (File.Exists(Settings.Default.PathToBash))
{
this.pathToBash = Settings.Default.PathToBash;
}
else
{
this.pathToBash = "bash.exe";
}
}
protected override string FileName
{
get
{
return this.pathToBash;
}
}
public override bool FileExists(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
string command = string.Format("-c \"[ -f {0} ] && echo {1} || echo {2}\"", bashPath, ShellRunner.SuccessOutput, ShellRunner.FailureOutput);
string output = this.RunProcess(command).Trim();
return output.Equals(ShellRunner.SuccessOutput, StringComparison.InvariantCulture);
}
public override string MoveFile(string sourcePath, string targetPath)
{
string sourceBashPath = this.ConvertWinPathToBashPath(sourcePath);
string targetBashPath = this.ConvertWinPathToBashPath(targetPath);
return this.RunProcess(string.Format("-c \"mv {0} {1}\"", sourceBashPath, targetBashPath));
}
public override void MoveFileShouldFail(string sourcePath, string targetPath)
{
// BashRunner does nothing special when a failure is expected, so just confirm the source file is still present
this.MoveFile(sourcePath, targetPath);
this.FileExists(sourcePath).ShouldEqual(true);
}
public override void MoveFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContainOneOf(fileNotFoundMessages);
}
public override string ReplaceFile(string sourcePath, string targetPath)
{
string sourceBashPath = this.ConvertWinPathToBashPath(sourcePath);
string targetBashPath = this.ConvertWinPathToBashPath(targetPath);
return this.RunProcess(string.Format("-c \"mv -f {0} {1}\"", sourceBashPath, targetBashPath));
}
public override string DeleteFile(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
return this.RunProcess(string.Format("-c \"rm {0}\"", bashPath));
}
public override string ReadAllText(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
string output = this.RunProcess(string.Format("-c \"cat {0}\"", bashPath));
// Bash sometimes adds a trailing "\n" to the output that we need to remove.
// Until we figure out why, we cannot use this runner with files that have trailing newlines.
if (output.Length > 0 &&
output.Substring(output.Length - 1).Equals("\n", StringComparison.InvariantCultureIgnoreCase) &&
!(output.Length > 1 &&
output.Substring(output.Length - 2).Equals("\r\n", StringComparison.InvariantCultureIgnoreCase)))
{
output = output.Remove(output.Length - 1, 1);
}
return output;
}
public override void AppendAllText(string path, string contents)
{
string bashPath = this.ConvertWinPathToBashPath(path);
this.RunProcess(string.Format("-c \"echo -n \\\"{0}\\\" >> {1}\"", contents, bashPath));
}
public override void CreateEmptyFile(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
this.RunProcess(string.Format("-c \"touch {0}\"", bashPath));
}
public override void WriteAllText(string path, string contents)
{
string bashPath = this.ConvertWinPathToBashPath(path);
this.RunProcess(string.Format("-c \"echo \\\"{0}\\\" > {1}\"", contents, bashPath));
}
public override void WriteAllTextShouldFail<ExceptionType>(string path, string contents)
{
// BashRunner does nothing special when a failure is expected
this.WriteAllText(path, contents);
}
public override bool DirectoryExists(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
string output = this.RunProcess(string.Format("-c \"[ -d {0} ] && echo {1} || echo {2}\"", bashPath, ShellRunner.SuccessOutput, ShellRunner.FailureOutput)).Trim();
return output.Equals(ShellRunner.SuccessOutput, StringComparison.InvariantCulture);
}
public override void MoveDirectory(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath);
}
public override void MoveDirectory_RequestShouldNotBeSupported(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContain(moveDirectoryNotSupportedMessage);
}
public override void MoveDirectory_TargetShouldBeInvalid(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContain(invalidMovePathMessages);
}
public override void CreateDirectory(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
this.RunProcess(string.Format("-c \"mkdir {0}\"", bashPath));
}
public override string DeleteDirectory(string path)
{
string bashPath = this.ConvertWinPathToBashPath(path);
return this.RunProcess(string.Format("-c \"rm -rf {0}\"", bashPath));
}
public override void ReplaceFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.ReplaceFile(sourcePath, targetPath).ShouldContainOneOf(fileNotFoundMessages);
}
public override void DeleteFile_FileShouldNotBeFound(string path)
{
this.DeleteFile(path).ShouldContainOneOf(fileNotFoundMessages);
}
public override void DeleteFile_AccessShouldBeDenied(string path)
{
this.DeleteFile(path).ShouldContain(permissionDeniedMessage);
}
public override void ReadAllText_FileShouldNotBeFound(string path)
{
this.ReadAllText(path).ShouldContainOneOf(fileNotFoundMessages);
}
public override void DeleteDirectory_DirectoryShouldNotBeFound(string path)
{
// Delete directory silently succeeds when deleting a non-existent path
this.DeleteDirectory(path);
}
public override void DeleteDirectory_ShouldBeBlockedByProcess(string path)
{
Assert.Fail("Unlike the other runners, bash.exe does not check folder handle before recusively deleting");
}
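// Converts a Windows path to its MinGW-style bash equivalent,
// e.g. "C:\GVFS\file.txt" becomes "/C/GVFS/file.txt"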
private string ConvertWinPathToBashPath(string winPath)
{
string bashPath = string.Concat("/", winPath);
bashPath = bashPath.Replace(":\\", "/");
bashPath = bashPath.Replace('\\', '/');
return bashPath;
}
}
}


@@ -0,0 +1,185 @@
using GVFS.Tests.Should;
using System;
using System.IO;
namespace GVFS.FunctionalTests.FileSystemRunners
{
public class CmdRunner : ShellRunner
{
private const string ProcessName = "CMD.exe";
private static string[] missingFileErrorMessages = new string[]
{
"The system cannot find the file specified.",
"The system cannot find the path specified.",
"Could Not Find"
};
private static string[] moveDirectoryFailureMessage = new string[]
{
"0 dir(s) moved"
};
private static string[] fileUsedByAnotherProcessMessage = new string[]
{
"The process cannot access the file because it is being used by another process"
};
protected override string FileName
{
get
{
return ProcessName;
}
}
public override bool FileExists(string path)
{
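// "if exist" matches directories as well as files, so rule out directories first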
if (this.DirectoryExists(path))
{
return false;
}
string output = this.RunProcess(string.Format("/C if exist {0} (echo {1}) else (echo {2})", path, ShellRunner.SuccessOutput, ShellRunner.FailureOutput)).Trim();
return output.Equals(ShellRunner.SuccessOutput, StringComparison.InvariantCulture);
}
public override string MoveFile(string sourcePath, string targetPath)
{
return this.RunProcess(string.Format("/C move {0} {1}", sourcePath, targetPath));
}
public override void MoveFileShouldFail(string sourcePath, string targetPath)
{
// CmdRunner does nothing special when a failure is expected
this.MoveFile(sourcePath, targetPath);
}
public override void MoveFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContainOneOf(missingFileErrorMessages);
}
public override string ReplaceFile(string sourcePath, string targetPath)
{
return this.RunProcess(string.Format("/C move /Y {0} {1}", sourcePath, targetPath));
}
public override string DeleteFile(string path)
{
return this.RunProcess(string.Format("/C del {0}", path));
}
public override string ReadAllText(string path)
{
return this.RunProcess(string.Format("/C type {0}", path));
}
public override void CreateEmptyFile(string path)
{
this.RunProcess(string.Format("/C type NUL > {0}", path));
}
public override void AppendAllText(string path, string contents)
{
// Use echo|set /p with "" to avoid adding any trailing whitespace or newline
// to the contents
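// e.g. (assumed example values) contents "abc" and path C:\repo\file.txt yield the arguments:
//   /C echo|set /p ="abc" >> C:\repo\file.txt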
this.RunProcess(string.Format("/C echo|set /p =\"{0}\" >> {1}", contents, path));
}
public override void WriteAllText(string path, string contents)
{
// Use echo|set /p with "" to avoid adding any trailing whitespace or newline
// to the contents
this.RunProcess(string.Format("/C echo|set /p =\"{0}\" > {1}", contents, path));
}
public override void WriteAllTextShouldFail<ExceptionType>(string path, string contents)
{
// CmdRunner does nothing special when a failure is expected
this.WriteAllText(path, contents);
}
public override bool DirectoryExists(string path)
{
string parentDirectory = Path.GetDirectoryName(path);
string targetName = Path.GetFileName(path);
string output = this.RunProcess(string.Format("/C dir /A:d /B {0}", parentDirectory));
string[] directories = output.Split(new string[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries);
foreach (string directory in directories)
{
if (directory.Equals(targetName, StringComparison.OrdinalIgnoreCase))
{
return true;
}
}
return false;
}
public override void CreateDirectory(string path)
{
this.RunProcess(string.Format("/C mkdir {0}", path));
}
public override string DeleteDirectory(string path)
{
return this.RunProcess(string.Format("/C rmdir /q /s {0}", path));
}
public override void MoveDirectory(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath);
}
public override void MoveDirectory_RequestShouldNotBeSupported(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContain(moveDirectoryFailureMessage);
}
public override void MoveDirectory_TargetShouldBeInvalid(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContain(moveDirectoryFailureMessage);
}
public string RunCommand(string command)
{
return this.RunProcess(string.Format("/C {0}", command));
}
public override void ReplaceFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.ReplaceFile(sourcePath, targetPath).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteFile_FileShouldNotBeFound(string path)
{
this.DeleteFile(path).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteFile_AccessShouldBeDenied(string path)
{
// CMD does not report any error messages when access is denied, so just confirm the file still exists
this.DeleteFile(path);
this.FileExists(path).ShouldEqual(true);
}
public override void ReadAllText_FileShouldNotBeFound(string path)
{
this.ReadAllText(path).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteDirectory_DirectoryShouldNotBeFound(string path)
{
this.DeleteDirectory(path).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteDirectory_ShouldBeBlockedByProcess(string path)
{
this.DeleteDirectory(path).ShouldContain(fileUsedByAnotherProcessMessage);
}
}
}


@@ -0,0 +1,113 @@
using System;
namespace GVFS.FunctionalTests.FileSystemRunners
{
public abstract class FileSystemRunner
{
/// <summary>
/// String that identifies which list to use when running tests
/// </summary>
public const string TestRunners = "Runners";
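// Example consumption from an NUnit fixture (assumed usage, not part of this file):
//   [TestCaseSource(typeof(FileSystemRunner), FileSystemRunner.TestRunners)]
//   public void TestSomething(FileSystemRunner fileSystem) { ... }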
private static FileSystemRunner defaultRunner = new CmdRunner();
/// <summary>
/// Runners to use when the debugger is not attached
/// </summary>
private static object[] allRunners =
{
new object[] { new SystemIORunner() },
new object[] { new CmdRunner() },
new object[] { new PowerShellRunner() },
new object[] { new BashRunner() },
};
/// <summary>
/// Runners to use when the debugger is attached
/// </summary>
private static object[] debugRunners =
{
new object[] { defaultRunner }
};
public static bool UseAllRunners { get; set; }
public static object[] Runners
{
get { return UseAllRunners ? allRunners : debugRunners; }
}
/// <summary>
/// Default runner to use (for tests that do not need to be run with multiple runners)
/// </summary>
public static FileSystemRunner DefaultRunner
{
get { return defaultRunner; }
}
// File methods
public abstract bool FileExists(string path);
public abstract string MoveFile(string sourcePath, string targetPath);
/// <summary>
/// Attempt to move the specified file to the specified target path. By calling this method the caller is
/// indicating that they expect the move to fail. However, the caller is responsible for verifying that
/// the move failed.
/// </summary>
/// <param name="sourcePath">Path to existing file</param>
/// <param name="targetPath">Path to target file (target of the move)</param>
public abstract void MoveFileShouldFail(string sourcePath, string targetPath);
public abstract void MoveFile_FileShouldNotBeFound(string sourcePath, string targetPath);
public abstract string ReplaceFile(string sourcePath, string targetPath);
public abstract void ReplaceFile_FileShouldNotBeFound(string sourcePath, string targetPath);
public abstract string DeleteFile(string path);
public abstract void DeleteFile_FileShouldNotBeFound(string path);
public abstract void DeleteFile_AccessShouldBeDenied(string path);
public abstract string ReadAllText(string path);
public abstract void ReadAllText_FileShouldNotBeFound(string path);
public abstract void CreateEmptyFile(string path);
/// <summary>
/// Write the specified contents to the specified file. By calling this method the caller is
/// indicating that they expect the write to succeed. However, the caller is responsible for verifying that
/// the write succeeded.
/// </summary>
/// <param name="path">Path to file</param>
/// <param name="contents">File contents</param>
public abstract void WriteAllText(string path, string contents);
/// <summary>
/// Append the specified contents to the specified file. By calling this method the caller is
/// indicating that they expect the write to succeed. However, the caller is responsible for verifying that
/// the write succeeded.
/// </summary>
/// <param name="path">Path to file</param>
/// <param name="contents">File contents</param>
public abstract void AppendAllText(string path, string contents);
/// <summary>
/// Attempt to write the specified contents to the specified file. By calling this method the caller is
/// indicating that they expect the write to fail. However, the caller is responsible for verifying that
/// the write failed.
/// </summary>
/// <typeparam name="ExceptionType">Expected type of exception to be thrown</typeparam>
/// <param name="path">Path to file</param>
/// <param name="contents">File contents</param>
public abstract void WriteAllTextShouldFail<ExceptionType>(string path, string contents) where ExceptionType : Exception;
// Directory methods
public abstract bool DirectoryExists(string path);
public abstract void MoveDirectory(string sourcePath, string targetPath);
public abstract void MoveDirectory_RequestShouldNotBeSupported(string sourcePath, string targetPath);
public abstract void MoveDirectory_TargetShouldBeInvalid(string sourcePath, string targetPath);
public abstract void CreateDirectory(string path);
/// <summary>
/// A recursive delete of a directory
/// </summary>
public abstract string DeleteDirectory(string path);
public abstract void DeleteDirectory_DirectoryShouldNotBeFound(string path);
public abstract void DeleteDirectory_ShouldBeBlockedByProcess(string path);
}
}


@@ -0,0 +1,192 @@
using GVFS.Tests.Should;
using System.IO;
namespace GVFS.FunctionalTests.FileSystemRunners
{
public class PowerShellRunner : ShellRunner
{
private const string ProcessName = "powershell.exe";
private static string[] missingFileErrorMessages = new string[]
{
"Cannot find path"
};
private static string[] invalidPathErrorMessages = new string[]
{
"Could not find a part of the path"
};
private static string[] moveDirectoryNotSupportedMessage = new string[]
{
"The request is not supported."
};
private static string[] fileUsedByAnotherProcessMessage = new string[]
{
"The process cannot access the file because it is being used by another process"
};
private static string[] permissionDeniedMessage = new string[]
{
"PermissionDenied"
};
protected override string FileName
{
get
{
return ProcessName;
}
}
public override bool FileExists(string path)
{
string parentDirectory = Path.GetDirectoryName(path);
string targetName = Path.GetFileName(path);
// Use -force so that hidden items are returned as well
string command = string.Format("-Command \"&{{ Get-ChildItem -force {0} | where {{$_.Attributes -NotLike '*Directory*'}} | where {{$_.Name -eq '{1}' }} }}\"", parentDirectory, targetName);
string output = this.RunProcess(command).Trim();
if (output.Length == 0 || output.Contains("PathNotFound") || output.Contains("ItemNotFound"))
{
return false;
}
return true;
}
public override string MoveFile(string sourcePath, string targetPath)
{
return this.RunProcess(string.Format("-Command \"& {{ Move-Item {0} {1} }}\"", sourcePath, targetPath));
}
public override void MoveFileShouldFail(string sourcePath, string targetPath)
{
// PowerShellRunner does nothing special when a failure is expected
this.MoveFile(sourcePath, targetPath);
}
public override void MoveFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContainOneOf(missingFileErrorMessages);
}
public override string ReplaceFile(string sourcePath, string targetPath)
{
return this.RunProcess(string.Format("-Command \"& {{ Move-Item {0} {1} -force }}\"", sourcePath, targetPath));
}
public override string DeleteFile(string path)
{
return this.RunProcess(string.Format("-Command \"& {{ Remove-Item {0} }}\"", path));
}
public override string ReadAllText(string path)
{
string output = this.RunProcess(string.Format("-Command \"& {{ Get-Content -Raw {0} }}\"", path), errorMsgDelimeter: "\r\n");
// Get-Content insists on sticking a trailing "\r\n" at the end of the output that we need to remove
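// e.g. captured output "abc\r\n" becomes "abc"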
output.Length.ShouldBeAtLeast(2);
output.Substring(output.Length - 2).ShouldEqual("\r\n");
output = output.Remove(output.Length - 2, 2);
return output;
}
public override void AppendAllText(string path, string contents)
{
this.RunProcess(string.Format("-Command \"&{{ Out-File -FilePath {0} -InputObject '{1}' -Encoding ascii -Append -NoNewline}}\"", path, contents));
}
public override void CreateEmptyFile(string path)
{
this.RunProcess(string.Format("-Command \"&{{ New-Item -ItemType file {0}}}\"", path));
}
public override void WriteAllText(string path, string contents)
{
this.RunProcess(string.Format("-Command \"&{{ Out-File -FilePath {0} -InputObject '{1}' -Encoding ascii -NoNewline}}\"", path, contents));
}
public override void WriteAllTextShouldFail<ExceptionType>(string path, string contents)
{
// PowerShellRunner does nothing special when a failure is expected
this.WriteAllText(path, contents);
}
public override bool DirectoryExists(string path)
{
string command = string.Format("-Command \"&{{ Test-Path {0} -PathType Container }}\"", path);
string output = this.RunProcess(command).Trim();
if (output.Contains("True"))
{
return true;
}
return false;
}
public override void MoveDirectory(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath);
}
public override void MoveDirectory_RequestShouldNotBeSupported(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContain(moveDirectoryNotSupportedMessage);
}
public override void MoveDirectory_TargetShouldBeInvalid(string sourcePath, string targetPath)
{
this.MoveFile(sourcePath, targetPath).ShouldContain(invalidPathErrorMessages);
}
public override void CreateDirectory(string path)
{
this.RunProcess(string.Format("-Command \"&{{ New-Item {0} -type directory}}\"", path));
}
public override string DeleteDirectory(string path)
{
return this.RunProcess(string.Format("-Command \"&{{ Remove-Item -Force -Recurse {0} }}\"", path));
}
public override void ReplaceFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.ReplaceFile(sourcePath, targetPath).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteFile_FileShouldNotBeFound(string path)
{
this.DeleteFile(path).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteFile_AccessShouldBeDenied(string path)
{
this.DeleteFile(path).ShouldContain(permissionDeniedMessage);
}
public override void ReadAllText_FileShouldNotBeFound(string path)
{
this.ReadAllText(path).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteDirectory_DirectoryShouldNotBeFound(string path)
{
this.DeleteDirectory(path).ShouldContainOneOf(missingFileErrorMessages);
}
public override void DeleteDirectory_ShouldBeBlockedByProcess(string path)
{
this.DeleteDirectory(path).ShouldContain(fileUsedByAnotherProcessMessage);
}
protected override string RunProcess(string command, string errorMsgDelimeter = "")
{
return base.RunProcess("-NoProfile " + command, errorMsgDelimeter);
}
}
}


@@ -0,0 +1,27 @@
using GVFS.FunctionalTests.Tools;
using System.Diagnostics;
namespace GVFS.FunctionalTests.FileSystemRunners
{
public abstract class ShellRunner : FileSystemRunner
{
protected const string SuccessOutput = "True";
protected const string FailureOutput = "False";
protected abstract string FileName { get; }
protected virtual string RunProcess(string arguments, string errorMsgDelimeter = "")
{
ProcessStartInfo startInfo = new ProcessStartInfo();
startInfo.UseShellExecute = false;
startInfo.RedirectStandardOutput = true;
startInfo.RedirectStandardError = true;
startInfo.CreateNoWindow = true;
startInfo.FileName = this.FileName;
startInfo.Arguments = arguments;
ProcessResult result = ProcessHelper.Run(startInfo, errorMsgDelimeter: errorMsgDelimeter);
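// Prefer stdout, but fall back to stderr so callers can assert on error messages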
return !string.IsNullOrEmpty(result.Output) ? result.Output : result.Errors;
}
}
}


@@ -0,0 +1,197 @@
using NUnit.Framework;
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;
namespace GVFS.FunctionalTests.FileSystemRunners
{
public class SystemIORunner : FileSystemRunner
{
public static void RecursiveDelete(string path)
{
DirectoryInfo directory = new DirectoryInfo(path);
foreach (FileInfo file in directory.GetFiles())
{
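// Clear read-only (and other) attributes so the delete below is not rejected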
file.Attributes = FileAttributes.Normal;
RetryOnException(() => file.Delete());
}
foreach (DirectoryInfo subDirectory in directory.GetDirectories())
{
SystemIORunner.RecursiveDelete(subDirectory.FullName);
}
RetryOnException(() => directory.Delete());
}
public override bool FileExists(string path)
{
return File.Exists(path);
}
public override string MoveFile(string sourcePath, string targetPath)
{
File.Move(sourcePath, targetPath);
return string.Empty;
}
public override void MoveFileShouldFail(string sourcePath, string targetPath)
{
if (Debugger.IsAttached)
{
throw new InvalidOperationException("MoveFileShouldFail should not be run with the debugger attached");
}
this.ShouldFail<IOException>(() => { this.MoveFile(sourcePath, targetPath); });
}
public override void MoveFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.ShouldFail<IOException>(() => { this.MoveFile(sourcePath, targetPath); });
}
public override string ReplaceFile(string sourcePath, string targetPath)
{
File.Replace(sourcePath, targetPath, null);
return string.Empty;
}
public override void ReplaceFile_FileShouldNotBeFound(string sourcePath, string targetPath)
{
this.ShouldFail<IOException>(() => { this.ReplaceFile(sourcePath, targetPath); });
}
public override string DeleteFile(string path)
{
File.Delete(path);
return string.Empty;
}
public override void DeleteFile_FileShouldNotBeFound(string path)
{
// Delete file silently succeeds when file is non-existent
this.DeleteFile(path);
}
public override void DeleteFile_AccessShouldBeDenied(string path)
{
this.ShouldFail<UnauthorizedAccessException>(() => { this.DeleteFile(path); });
}
public override string ReadAllText(string path)
{
return File.ReadAllText(path);
}
public override void CreateEmptyFile(string path)
{
using (FileStream fs = File.Create(path))
{
}
}
public override void WriteAllText(string path, string contents)
{
File.WriteAllText(path, contents);
}
public override void AppendAllText(string path, string contents)
{
File.AppendAllText(path, contents);
}
public override void WriteAllTextShouldFail<ExceptionType>(string path, string contents)
{
if (Debugger.IsAttached)
{
throw new InvalidOperationException("WriteAllTextShouldFail should not be run with the debugger attached");
}
this.ShouldFail<ExceptionType>(() => { this.WriteAllText(path, contents); });
}
public override bool DirectoryExists(string path)
{
return Directory.Exists(path);
}
public override void MoveDirectory(string sourcePath, string targetPath)
{
Directory.Move(sourcePath, targetPath);
}
public override void MoveDirectory_RequestShouldNotBeSupported(string sourcePath, string targetPath)
{
if (Debugger.IsAttached)
{
throw new InvalidOperationException("MoveDirectory_RequestShouldNotBeSupported should not be run with the debugger attached");
}
Assert.Catch<IOException>(() => this.MoveDirectory(sourcePath, targetPath));
}
public override void MoveDirectory_TargetShouldBeInvalid(string sourcePath, string targetPath)
{
if (Debugger.IsAttached)
{
throw new InvalidOperationException("MoveDirectory_TargetShouldBeInvalid should not be run with the debugger attached");
}
Assert.Catch<IOException>(() => this.MoveDirectory(sourcePath, targetPath));
}
public override void CreateDirectory(string path)
{
Directory.CreateDirectory(path);
}
public override string DeleteDirectory(string path)
{
SystemIORunner.RecursiveDelete(path);
return string.Empty;
}
public override void DeleteDirectory_DirectoryShouldNotBeFound(string path)
{
this.ShouldFail<IOException>(() => { this.DeleteDirectory(path); });
}
public override void DeleteDirectory_ShouldBeBlockedByProcess(string path)
{
Assert.Fail("DeleteDirectory_ShouldBeBlockedByProcess not supported by SystemIORunner");
}
public override void ReadAllText_FileShouldNotBeFound(string path)
{
this.ShouldFail<IOException>(() => { this.ReadAllText(path); });
}
private static void RetryOnException(Action action)
{
for (int i = 0; i < 10; i++)
{
try
{
action();
break;
}
catch (IOException)
{
Thread.Sleep(500);
}
catch (UnauthorizedAccessException)
{
Thread.Sleep(500);
}
}
}
private void ShouldFail<ExceptionType>(Action action) where ExceptionType : Exception
{
Assert.Catch<ExceptionType>(() => action());
}
}
}


@@ -0,0 +1,201 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="14.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProjectGuid>{0F0A008E-AB12-40EC-A671-37A541B08C7F}</ProjectGuid>
<OutputType>Exe</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>GVFS.FunctionalTests</RootNamespace>
<AssemblyName>GVFS.FunctionalTests</AssemblyName>
<TargetFrameworkVersion>v4.5.2</TargetFrameworkVersion>
<FileAlignment>512</FileAlignment>
<VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">10.0</VisualStudioVersion>
<VSToolsPath Condition="'$(VSToolsPath)' == ''">$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v$(VisualStudioVersion)</VSToolsPath>
<ReferencePath>$(ProgramFiles)\Common Files\microsoft shared\VSTT\$(VisualStudioVersion)\UITestExtensionPackages</ReferencePath>
<IsCodedUITest>False</IsCodedUITest>
<TestProjectType>UnitTest</TestProjectType>
<NuGetPackageImportStamp>
</NuGetPackageImportStamp>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Debug|x64'">
<DebugSymbols>true</DebugSymbols>
<OutputPath>..\..\..\BuildOutput\GVFS.FunctionalTests\bin\x64\Debug\</OutputPath>
<IntermediateOutputPath>..\..\..\BuildOutput\GVFS.FunctionalTests\obj\x64\Debug\</IntermediateOutputPath>
<DefineConstants>DEBUG;TRACE</DefineConstants>
<DebugType>full</DebugType>
<PlatformTarget>x64</PlatformTarget>
<ErrorReport>prompt</ErrorReport>
<CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'Release|x64'">
<OutputPath>..\..\..\BuildOutput\GVFS.FunctionalTests\bin\x64\Release\</OutputPath>
<IntermediateOutputPath>..\..\..\BuildOutput\GVFS.FunctionalTests\obj\x64\Release\</IntermediateOutputPath>
<DefineConstants>TRACE</DefineConstants>
<Optimize>true</Optimize>
<DebugType>pdbonly</DebugType>
<PlatformTarget>x64</PlatformTarget>
<ErrorReport>prompt</ErrorReport>
<CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<PropertyGroup>
<StartupObject>GVFS.FunctionalTests.Program</StartupObject>
</PropertyGroup>
<PropertyGroup>
<RunPostBuildEvent>Always</RunPostBuildEvent>
</PropertyGroup>
<ItemGroup>
<Reference Include="Esent.Collections, Version=1.9.4.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\Microsoft.Database.Collections.Generic.1.9.4\lib\net40\Esent.Collections.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Esent.Interop, Version=1.9.4.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\ManagedEsent.1.9.4\lib\net40\Esent.Interop.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="Esent.Isam, Version=1.9.4.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\Microsoft.Database.Isam.1.9.4\lib\net40\Esent.Isam.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="nunit.framework, Version=3.5.0.0, Culture=neutral, PublicKeyToken=2638cd05610744eb, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\NUnit.3.5.0\lib\net45\nunit.framework.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="nunitlite, Version=3.5.0.0, Culture=neutral, PublicKeyToken=2638cd05610744eb, processorArchitecture=MSIL">
<SpecificVersion>False</SpecificVersion>
<HintPath>..\..\..\packages\NUnitLite.3.5.0\lib\net45\nunitlite.dll</HintPath>
<Private>True</Private>
</Reference>
<Reference Include="System" />
</ItemGroup>
<Choose>
<When Condition="('$(VisualStudioVersion)' == '10.0' or '$(VisualStudioVersion)' == '') and '$(TargetFrameworkVersion)' == 'v3.5'">
<ItemGroup>
<Reference Include="Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=10.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL" />
</ItemGroup>
</When>
<Otherwise />
</Choose>
<ItemGroup>
<Compile Include="Category\CategoryConstants.cs" />
<Compile Include="Program.cs" />
<Compile Include="Properties\Settings.Designer.cs">
<AutoGen>True</AutoGen>
<DesignTimeSharedInput>True</DesignTimeSharedInput>
<DependentUpon>Settings.settings</DependentUpon>
</Compile>
<Compile Include="Tests\EnlistmentPerFixture\DiagnoseTests.cs" />
<Compile Include="Tests\EnlistmentPerFixture\MountTests.cs" />
<Compile Include="Tests\EnlistmentPerFixture\GitFilesTests.cs" />
<Compile Include="Tests\EnlistmentPerTestCase\RebaseTests.cs" />
<Compile Include="Tests\EnlistmentPerTestCase\PrefetchVerbTests.cs" />
<Compile Include="Tests\EnlistmentPerTestCase\CaseOnlyFolderRenameTests.cs" />
<Compile Include="Tests\EnlistmentPerFixture\MoveRenameFileTests.cs" />
<Compile Include="Tests\EnlistmentPerFixture\MoveRenameFileTests_2.cs" />
<Compile Include="Tests\EnlistmentPerTestCase\PersistedSparseExcludeTests.cs" />
<Compile Include="Tests\EnlistmentPerTestCase\PersistedWorkingDirectoryTests.cs" />
<Compile Include="Tests\EnlistmentPerFixture\MoveRenameFolderTests.cs" />
<Compile Include="Tests\EnlistmentPerTestCase\TestsWithEnlistmentPerTestCase.cs" />
<Compile Include="Tests\EnlistmentPerFixture\TestsWithEnlistmentPerFixture.cs" />
<Compile Include="Tests\EnlistmentPerFixture\GitCommandsTests.cs" />
<Compile Include="Tests\FastFetchTests.cs" />
<Compile Include="Tests\LongRunningEnlistment\GitMoveRenameTests.cs" />
<Compile Include="Tests\LongRunningEnlistment\MultithreadedReadWriteTests.cs" />
<Compile Include="Tests\LongRunningEnlistment\WorkingDirectoryTests.cs" />
<Compile Include="Tests\LongRunningEnlistment\GitReadAndGitLockTests.cs" />
<Compile Include="Tests\LongRunningEnlistment\LongRunningSetup.cs" />
<Compile Include="Tests\LongRunningEnlistment\TestsWithLongRunningEnlistment.cs" />
<Compile Include="Tests\EnlistmentPerFixture\WorkingDirectoryTests.cs" />
<Compile Include="Tests\PrintTestCaseStats.cs" />
<Compile Include="FileSystemRunners\BashRunner.cs" />
<Compile Include="FileSystemRunners\PowerShellRunner.cs" />
<Compile Include="FileSystemRunners\CmdRunner.cs" />
<Compile Include="FileSystemRunners\FileSystemRunner.cs" />
<Compile Include="Tools\GitHelpers.cs" />
<Compile Include="Tools\GitProcess.cs" />
<Compile Include="Tools\GVFSFunctionalTestEnlistment.cs" />
<Compile Include="Tools\NativeMethods.cs" />
<Compile Include="Tools\ProcessHelper.cs" />
<Compile Include="Tools\ProcessResult.cs" />
<Compile Include="Tools\TestConstants.cs" />
<Compile Include="Tools\GVFSProcess.cs" />
<Compile Include="Properties\AssemblyInfo.cs" />
<Compile Include="Should\FileSystemShouldExtensions.cs" />
<Compile Include="FileSystemRunners\ShellRunner.cs" />
<Compile Include="FileSystemRunners\SystemIORunner.cs" />
<Compile Include="Tools\ControlGitRepo.cs" />
</ItemGroup>
<ItemGroup>
<None Include="App.config">
<SubType>Designer</SubType>
</None>
<None Include="packages.config">
<SubType>Designer</SubType>
</None>
<None Include="Properties\Settings.settings">
<Generator>SettingsSingleFileGenerator</Generator>
<LastGenOutput>Settings.Designer.cs</LastGenOutput>
</None>
</ItemGroup>
<ItemGroup>
<Service Include="{82A7F48D-3B50-4B1E-B82E-3ADA8210C358}" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\GVFS.Tests\GVFS.Tests.csproj">
<Project>{72701BC3-5DA9-4C7A-BF10-9E98C9FC8EAC}</Project>
<Name>GVFS.Tests</Name>
</ProjectReference>
</ItemGroup>
<Choose>
<When Condition="'$(VisualStudioVersion)' == '10.0' And '$(IsCodedUITest)' == 'True'">
<ItemGroup>
<Reference Include="Microsoft.VisualStudio.QualityTools.CodedUITestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
<Private>False</Private>
</Reference>
<Reference Include="Microsoft.VisualStudio.TestTools.UITest.Common, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
<Private>False</Private>
</Reference>
<Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
<Private>False</Private>
</Reference>
<Reference Include="Microsoft.VisualStudio.TestTools.UITesting, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
<Private>False</Private>
</Reference>
</ItemGroup>
</When>
</Choose>
<Import Project="$(VSToolsPath)\TeamTest\Microsoft.TestTools.targets" Condition="Exists('$(VSToolsPath)\TeamTest\Microsoft.TestTools.targets')" />
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
<PropertyGroup>
<ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
</PropertyGroup>
<Error Condition="!Exists('..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets'))" />
<Error Condition="!Exists('..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets')" Text="$([System.String]::Format('$(ErrorText)', '..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets'))" />
</Target>
<Import Project="..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets" Condition="Exists('..\..\..\packages\StyleCop.MSBuild.4.7.54.0\build\StyleCop.MSBuild.Targets')" />
<Import Project="..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets" Condition="Exists('..\..\..\packages\StyleCop.Error.MSBuild.1.0.0\build\StyleCop.Error.MSBuild.Targets')" />
<PropertyGroup>
<PostBuildEvent>xcopy /Y $(SolutionDir)..\BuildOutput\GVFS\bin\$(Platform)\$(Configuration)\* $(TargetDir)
xcopy /Y $(SolutionDir)..\BuildOutput\GVFS.Mount\bin\$(Platform)\$(Configuration)\* $(TargetDir)
xcopy /Y $(SolutionDir)..\BuildOutput\FastFetch\bin\$(Platform)\$(Configuration)\* $(TargetDir)
xcopy /Y $(SolutionDir)..\BuildOutput\GVFS.Hooks\bin\$(Platform)\$(Configuration)\* $(TargetDir)</PostBuildEvent>
</PropertyGroup>
<PropertyGroup>
<PreBuildEvent>
</PreBuildEvent>
</PropertyGroup>
<!-- To modify your build process, add your task inside one of the targets below and uncomment it.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
-->
</Project>
