You can use version control commands to do nearly all Team Foundation Version Control (TFVC) tasks that you can do in Visual Studio. You can also use version control commands to do several tasks that can't be done in Visual Studio. To run version control commands from a command prompt or within a script, you use the tf.exe tool.
For Visual Studio 2019 and later versions, the tf.exe binary is no longer in a fixed location in the Visual Studio install path as in some previous releases, for example, C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE. If your script uses tf.exe, don't hard-code a path to the file based on the Visual Studio install path.
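One way to locate tf.exe from a script without hard-coding a path is the vswhere.exe utility, which Visual Studio 2017 and later installers place at a fixed location. This is a sketch, not the official guidance; the glob pattern for tf.exe inside the installation is an assumption and may need adjusting for your layout:

```powershell
# vswhere.exe lives at a fixed path once any VS 2017+ edition is installed
$vswhere = "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe"

# Ask vswhere for the newest VS instance and search it for tf.exe
# (the search pattern is an assumption about the install layout)
$tf = & $vswhere -latest -products * -find "**\tf.exe" | Select-Object -First 1
& $tf help
```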
In most cases, you run the version control command in the context of a directory that's mapped in the workspace. For example, $/SiteApp/Main/ is mapped to c:\code\SiteApp\Main\. To get the latest version of all items in the workspace, use the following command:
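From the mapped directory, the command is simply tf get with no arguments (the path below is the illustrative mapping from above):

```powershell
cd C:\code\SiteApp\Main   # a directory mapped in the workspace
tf get                    # download the latest version of all items
```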
Your workspace is a local copy of your team's codebase. Because it's a local copy on your development machine, you can develop and test your code in isolation until you're ready to check in your work. Here are some commands to manage your workspace:
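For instance, the commands below cover the basic workspace lifecycle; the collection URL, workspace name, and paths are placeholders:

```powershell
# Create a new workspace in a project collection
tf workspace /new MyWorkspace /collection:https://dev.azure.com/fabrikam

# Map a server path to a local folder in that workspace
tf workfold /map $/SiteApp/Main C:\code\SiteApp\Main /workspace:MyWorkspace

# List your workspaces
tf workspaces

# Delete a workspace you no longer need
tf workspace /delete MyWorkspace
```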
Non-bracketed arguments are required. [Brackets] indicate optional arguments that aren't required to complete a command. However, some optional arguments have defaults that are applied to the command even if you don't specify the option.
Items that aren't enclosed in brackets are options that you include verbatim. Items enclosed in angle brackets (< >) are arguments that you must replace with actual values to perform a command.
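As an illustration of this notation, a reference entry might look like the following: checkout and /lock are typed verbatim, the bracketed option can be omitted, and <itemSpec> must be replaced with a real path.

```
tf checkout [/lock:(none|checkin|checkout)] <itemSpec>
```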
You use an item specification to specify the items affected by a command. You can specify items either on a client machine or on your Azure DevOps server. You can use wildcard characters such as * and ?.
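For example, to check out every C# file in the current directory and below (assuming the directory is mapped in your workspace):

```powershell
tf checkout *.cs /recursive
```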
You typically use server item specification arguments when you need to run a command on items that aren't on the client machine. For example, say you're working on a development machine. If you need to get some revision history data about some items that are in a project collection that you don't work in, you can use the following command:
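A hedged example, with a placeholder organization and server path:

```powershell
# Query revision history for C# files in a collection you don't have mapped locally
tf history /collection:https://dev.azure.com/fabrikam $/SiteApp/Main/*.cs /recursive
```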
Use the /noprompt option to suppress requests for data input and redirect output data to the command prompt window. This option can be useful when you need to use version control commands in a script where:
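For example (again with placeholder collection URL and path), adding /noprompt makes the query run non-interactively and write its results to the console instead of a dialog:

```powershell
tf history /collection:https://dev.azure.com/fabrikam $/SiteApp/Main /recursive /noprompt
```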
The problem lies with something legacy that we try to avoid wherever possible, but in this case there seems to be an exception. In Azure DevOps, the default for version control was once (actually not that long ago) Team Foundation Version Control (TFVC), and to get your code locally you needed to set up a so-called "workspace". That workspace took care of the mapping between the files in version control and the files on the local computer. You could do all kinds of cool stuff with it (and sometimes things went out of control). These workspaces were also needed in the context of the build: at the start of a build, a workspace was created and the required mapping was put in place so the build could do its work. After the build was done, the files (and the workspace) were removed. This worked fine, except when builds failed: in that case the cleanup did not always happen, and the result was that the build server sometimes tried to create a workspace with a name that was already in use. (This is a problem because duplicate workspace names are not allowed in TFVC.) That means that from time to time some kind of cleanup needs to be done, and typically we used tools like TFS Sidekicks for this. Because support for that tool stopped with Visual Studio 2015, I tried to do it differently:
Every installation of Visual Studio comes with an executable called tf.exe. It allows you to automate and script things in TFS/Azure DevOps and has been around for a while. There are better alternatives now (such as the Azure CLI and the Azure DevOps SDK), but since we are working with something legacy anyway, this approach seemed applicable.
Note: the server name can also be *, but then I strongly suggest also using the (optional) /owner flag, because otherwise you will list all workspaces of all users on all machines. In essence there is no harm in doing so when merely listing the workspaces, but it is a whole different matter once you start the cleanup.
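Listing workspaces across machines might look like this; the organization URL and owner are placeholders:

```powershell
# List every workspace on every computer for one owner in the collection
tf workspaces /computer:* /owner:jdoe /collection:https://dev.azure.com/fabrikam /format:brief
```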
You would then think that cleaning up all these workspaces at once can be done by adding the /remove flag to the previous command. Unfortunately, it's not that simple; it depends on the current configuration of your PC. If the workspaces are on your own machine, you can try to do it with the workspaces subcommand. Otherwise, you have to work with the workspace command, which is described a bit further down.
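For completeness, the /remove flag looks like the sketch below (workspace names and URL are placeholders). As I understand the command's help text, it removes cached workspace entries on the client rather than deleting anything on the server, which fits the observation that it only helps for workspaces on your own machine:

```powershell
# Remove cached entries for the named workspaces (client-side cache only)
tf workspaces /remove:OldWorkspace1,OldWorkspace2 /collection:https://dev.azure.com/fabrikam
```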
This means that you have to be smart about the filter you pass to the /remove flag. Adding a * would remove all the workspaces that are found, and in our context that would mean that all workspaces on all computers of all users would be deleted, which is probably not what you want. In my context there are two options that I assume should work:
So, as the workspaces command did not work for me, I had to resort to another approach. The alternative is the tf workspace command, which accepts a /delete flag. This lets you specify one (and only one) workspace to delete. It is not ideal, but this was the road I went down and wanted to complete. So I wrote some (very basic) PowerShell to extract the workspace names from the table output of the tf workspaces command used earlier.
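The original script is not reproduced here, but a minimal sketch of the idea might look like this. The collection URL, the computer name, and the number of header rows in the table output are assumptions, and workspace or owner names containing spaces would need more careful parsing:

```powershell
$collection = "https://dev.azure.com/fabrikam"   # placeholder organization

# Get the table output of 'tf workspaces' and skip the assumed header rows
tf workspaces /computer:BUILDSERVER /owner:* /collection:$collection /format:brief |
    Select-Object -Skip 3 |
    ForEach-Object {
        # Assume the first two whitespace-separated columns are name and owner
        $columns = $_ -split '\s+'
        $name  = $columns[0]
        $owner = $columns[1]

        # Delete the workspace; tf asks for confirmation on each one
        tf workspace /delete "$name;$owner" /collection:$collection
    }
```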
Running this script will loop over all the workspaces and delete them one by one. The "only" issue that I could not overcome is that the /delete flag does not seem to accept anything to confirm up front or to force the operation, which means that each deletion has to be confirmed interactively. In this case I could live with that, as this is most likely not something I will have to do again anytime soon.
I showed you how to clean up old TFVC workspaces with relative ease, and now you should be able to do so too. Be aware that Microsoft prefers you to use Git and that TFVC is no longer actively developed. If you want to migrate from TFVC to Git (within Azure DevOps), you can automate this. I have done it and will write about it soon!
Oh! It works there. By the way, this was my favorite approach for working with Team Foundation Server, until I wondered: what if this would work from the Visual Studio Package Manager Console, which uses PowerShell?
Awesome. So why do I like this approach? If I am working in regular PowerShell and want to find pending changes, I have to navigate to my working directory before running tf status. But in the Package Manager Console I can just run tf status and it gives me pending changes for exactly the project and branch I am in. Try running pwd; it gives the current location of your project. And since Visual Studio supports docking for all its windows, I just dock the Package Manager Console on another monitor.
We have been using the Team Foundation Create Work, Map Working Folder, etc. actions to communicate with an on-premises TFS server. We have moved some of our code to Azure DevOps (formerly VSTS) and want to reuse the same build scripts, but unfortunately we cannot get these actions to authenticate against Azure DevOps.
All permissions are in place in Azure DevOps, and we can make it work if we first run the tf.exe /login command by itself; a pop-up prompt then appears from Azure DevOps where we enter credentials, and afterwards we can create a new workspace, etc.
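One workaround that may help in unattended scenarios is passing credentials explicitly via tf's /login option, using a personal access token (PAT) as the password. The account below is a placeholder, and <PAT> must be replaced with a real token:

```powershell
# Authenticate non-interactively with a PAT instead of the credential pop-up
tf workspaces /collection:https://dev.azure.com/fabrikam /login:jdoe@example.com,<PAT>
```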
When tf.exe (beta 3 refresh version) displays an error message, the text is colored yellow on black. My command prompt windows usually have black text on a white background. Is it possible to make tf.exe display all output without changing the color, e.g. via an environment variable?
If you simply want to alter the coloring to make it look better, you could use something like the following. The color choices are black, blue, cyan, darkblue, darkcyan, darkgray, darkgreen, darkmagenta, darkred, darkyellow, gray, green, magenta, red, white, and yellow.
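For instance, one way to re-color the tool's output from PowerShell is to capture it and re-emit every line in a single color of your choice (tf's own coloring is lost in the process; the choice of tf status and of gray is illustrative):

```powershell
# Merge stderr into stdout, then print each line in plain gray
tf status 2>&1 | ForEach-Object { Write-Host $_ -ForegroundColor Gray }
```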
Lots of things in the software development world cost money. Some of these money pits are obvious and clearly visible, while others lurk in the shadows, obscured by the guise of an aesthetically pleasing and well-recognized logo. Lack of efficiency creates productivity parasites that induce thousands of cuts on your development time either directly or by proxy, causing hours upon hours to be lost across a team over the course of a year.
Anyone who has spent time in the professional development world on the Microsoft stack has probably encountered a product known as Team Foundation Server. This "collaboration platform" is the evolution of Microsoft's original server-based source control system: Visual SourceSafe. TFS not only offers tight integration with the development environment, but also a bug-tracking system that is available in-IDE as well as from the browser (and by browser, I mean IE).
The concept sounds great, but in keeping with the established pattern of Microsoft offerings, the solution doesn't exactly live up to its marketing-painted image. Under the covers of this platform exists a dark and evil underbelly of tedium, bad performance, design flaws, weak features, and counter-intuitive processes.
I am going to articulate some of the shortcomings and provide reasons why other source control technologies like Git, Subversion, and Mercurial are superior, but before I get started, I have one request. If you read the description of an inefficiency and cannot fathom that, in a logical world, a source control system could possibly behave in the manner described, realize one thing: Microsoft has gone to great lengths to marginalize the inefficiencies using naming conventions, ambiguous descriptions, and world-class marketing techniques. The product's primary purpose is not to efficiently manage your source control, but to oppressively preside over it to the point that it is difficult to pry yourself away from it.