My .NET development team was massively restructured recently, and as a result we now have the freedom to set up things like source control, database backups, and development databases. However, none of us is really a DBA, so we have no clue how to do this on the database side.
We would like some way for developers to work with real data for testing while being unable to change the data on the production server. Right now we have just one database, so if we need to test a change to the website, we have to create dummy records in production to test against.
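Something like the sketch below is what I imagine we need, though I don't know if it's the right approach: the production backup gets restored onto a separate development server on a schedule, and developers get read-only access to production. All the server, database, file, and account names here are made up.

```sql
-- Hypothetical: restore last night's production backup onto the dev
-- server so developers work against current, real data. The logical
-- file names and paths are placeholders.
RESTORE DATABASE WebAppDB
FROM DISK = N'\\backupshare\WebAppDB_full.bak'
WITH MOVE 'WebAppDB'     TO N'D:\Data\WebAppDB.mdf',
     MOVE 'WebAppDB_log' TO N'D:\Logs\WebAppDB.ldf',
     REPLACE;

-- Hypothetical: on the production server, developers get read-only
-- access through a domain group, so they can look but not change data.
CREATE LOGIN [CORP\Developers] FROM WINDOWS;
USE WebAppDB;
CREATE USER [CORP\Developers] FOR LOGIN [CORP\Developers];
ALTER ROLE db_datareader ADD MEMBER [CORP\Developers];
```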
Basically, we know this is a bad setup, but we don't know any better, and we really want to change. I can't find anything online that gives an overview of how to actually set up a development environment which can be used for both application development and database development.
I've looked at How to develop a database (workflow)? and How to setup local database development process for small web team?, but the first one doesn't address my problem, and the second one assumes a lot of knowledge that I don't have. I don't know anything about "normal" database development practices, rollout/rollback scripts, how to write tests for database development, schema comparison tools, etc. (These are all terms I've seen in my search, but I don't know where to go to learn about them.)
My team is very averse to using TFS. I'm not sure why (I've never used it myself), but they like Git, and I know Git, so that is the source control we've decided on. My main question was actually how to set up separate servers for the different environments, while having current data to work with in development.
Furthermore, because you're coming at this from a developer-workflow point of view, you'll likely discover that this kind of information doesn't get talked about much. You're probably going to have to pay a consultant to come in, help you set up the dev server with basic backups and maintenance, and otherwise create a workflow for you.
If you were determined to do it yourself, you'd search YouTube for videos on SSDT or SQL Server Database Projects, which are part of Visual Studio; most of what you want is covered in some way there (unit testing, source control through TFS or similar, deployments through MSBuild, and even schema compares). Unit testing is also commonly done with a framework called tSQLt. Deployments are sometimes done with Flyway. Database backups and maintenance are usually handled with Ola Hallengren's scripts or Minion Backup/Reindex.
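To give a flavor of tSQLt: a test lives in a test-class schema, fakes out the tables it touches so it fully controls the data, and asserts on the results. The procedure and table below are hypothetical, but the tSQLt calls are the framework's standard ones.

```sql
-- A minimal tSQLt test, assuming a hypothetical procedure
-- dbo.GetActiveCustomers that should return only active rows.
EXEC tSQLt.NewTestClass 'CustomerTests';
GO
CREATE PROCEDURE CustomerTests.[test GetActiveCustomers excludes inactive rows]
AS
BEGIN
    -- Swap the real table for an empty fake so the test owns its data
    EXEC tSQLt.FakeTable 'dbo.Customers';
    INSERT INTO dbo.Customers (Id, Name, IsActive)
    VALUES (1, 'Alice', 1), (2, 'Bob', 0);

    CREATE TABLE #Expected (Id INT, Name NVARCHAR(50));
    INSERT INTO #Expected VALUES (1, 'Alice');

    CREATE TABLE #Actual (Id INT, Name NVARCHAR(50));
    INSERT INTO #Actual EXEC dbo.GetActiveCustomers;

    EXEC tSQLt.AssertEqualsTable '#Expected', '#Actual';
END;
GO
EXEC tSQLt.Run 'CustomerTests';  -- run every test in the class
```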
For inexpensive, non-tailored training, Pluralsight has videos on each of the topics above and on other database administration subjects. But if you're not a DBA, you probably don't want to waste time learning all of these, and you'll still be stuck creating your own workflow. There is no soup-to-nuts "developer workflow" course.
Red Gate makes a lot of SQL Server developer software (in particular SQL Compare, SQL Source Control, and SQL Test). You could search for their training videos to get some ideas about what a workflow in these or other tools might look like. Again, you can find these on YouTube, though they also publish some free downloadable books as part of their marketing.
I'm a proponent of using SQL Server Data Tools (SSDT) for development and source control. Most .NET shops I've seen use Team Foundation Server (TFS), so the natural step is to use SSDT for the database piece. You can then treat your database development much like your .NET development (assuming you have separate servers for the different environments). SSDT gives you source control, schema and data comparison tools, and a way to build the deployment of the database into your normal build scripts.
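To make that concrete: an SSDT database project is essentially a folder of plain CREATE scripts, one per object, checked in alongside your C# code; publishing diffs those scripts against a target database and generates the necessary ALTER statements. A hypothetical table file might look like this:

```sql
-- dbo\Tables\Customers.sql in a hypothetical SSDT database project.
-- The project describes the *desired* state of the schema; deployment
-- works out the changes needed to bring a target database up to date.
CREATE TABLE dbo.Customers
(
    Id       INT           NOT NULL IDENTITY(1, 1),
    Name     NVARCHAR(100) NOT NULL,
    IsActive BIT           NOT NULL
        CONSTRAINT DF_Customers_IsActive DEFAULT (1),
    CONSTRAINT PK_Customers PRIMARY KEY CLUSTERED (Id)
);
```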
Source control is just one component of TFS, and it can integrate with Git for the source-control part no problem, but that's not what you were asking about. As for how to set up servers, each person who answers may have a different way of configuring them. Ideally, you would keep your application server separate from your database server, run the SQL Server services under separate, dedicated accounts, have the application connect to the database with a dedicated domain account, and keep real data only in production. Anything else depends on your available configuration options.
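As a sketch of the "dedicated, domain account" part (all names hypothetical): the application's service account gets only the rights it needs in the database, and nothing more.

```sql
-- Hypothetical least-privilege setup for the application's account.
CREATE LOGIN [CORP\WebAppService] FROM WINDOWS;

USE WebAppDB;
CREATE USER [CORP\WebAppService] FOR LOGIN [CORP\WebAppService];

-- If the app reads and writes tables directly:
ALTER ROLE db_datareader ADD MEMBER [CORP\WebAppService];
ALTER ROLE db_datawriter ADD MEMBER [CORP\WebAppService];

-- If the app goes through stored procedures, prefer EXECUTE rights
-- on the schema instead of broad table access:
-- GRANT EXECUTE ON SCHEMA::dbo TO [CORP\WebAppService];
```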
I'm following a tutorial that talks about having two servers working concurrently. This leaves me a bit confused - what's the difference between a web server and a dev server, and how can you have two servers working simultaneously?
From what I understand of your question, the tutorial is talking about two separate environments: a production environment (where the 'real' code that clients/users rely on runs) and a development environment (where code that is still being tested, and isn't ready for the 'real world', runs).
The reason you want (at least) two of them is so you can deploy your code to the dev environment and test it out - see if it works, how it works, and whether it breaks anything - without risking the real server going down, or accidentally wiping your real customer database, or something like that. When you deploy to your dev environment and see that everything works as it should and nothing is broken, you can then put that version of the code into your real, production environment and be pretty sure your backend isn't going to go down while you're developing.
Long story short: development servers are used to quickly develop applications. They provide a runtime environment that is incredibly useful regardless of the size of your project and where you stand in your developer journey.
Once you've installed both, we're ready to start. For this guide, we'll create a regular Vanilla.js project (using only HTML, CSS, and JavaScript). Go ahead and launch VS Code, then open a terminal window by navigating to View > Terminal.
Go ahead and play with it. You'll quickly learn why working with a development server is an industry standard. You only need to work on your files and hit Cmd+S every time you want to see a change reflected on the server.
I'm the dev lead for Java development and wanted to set up a dedicated development box with Git SCM and a Jenkins CI server. I was then going to look at something like Nexus to archive the builds we ship to customers.
I've worked in places that allow this kind of permission (including the current company, in a different era) and IMO it helps drive innovation and progress. This is only a dev box, so there's no production data to worry about. The current regime seems overly restrictive and is preventing me from doing my job.
The Git example took over two years to get set up, and I'm still waiting for Jenkins. Maybe it's because they don't really understand what it is. This is hugely frustrating, but I actually like the company, so I'm wondering how to approach this.
In your case, however, you're asking for admin access to the development server to set up tools, not as a developer. So you really should be asking to be the owner/admin of that server, with all the responsibilities for keeping it running and maintained. If you have the kind of org that has an admin for that server, you'll need to communicate with him directly so that the box is set up for the needs of the developers - he'll be the one to install Jenkins and set up the user account.
For a dev server, I think letting any dev do whatever they like to it is a bad idea; someone will set things up the way they want and break it for others. It's best to have a central point (either an admin or a senior dev) as the owner of the box, responsible for the admin activities it needs.
I stopped using Windows on my own PCs about two years ago. Not because I don't like Windows - I actually really like Windows 10 (keep reading) - but when you're running Linux servers, it's just so much easier to run the same OS on both your development PC and your production servers. And when you're administering servers, as you so often do when you have a homelab, you just can't beat the Linux command line.
Being back on Windows, I am quickly reacquainted with all of the things I missed about Windows. I now have Launchy back. I used Albert on Ubuntu, but he's no Launchy. On Windows, I also have access to all of the "creative" applications only available on "mainstream" consumer OSes, specifically the Adobe suite of products. I don't currently know too much about graphic design, but I've decided that it's a key skill for the developers of the future, and as such I will be teaching it to myself this year. And then there's PuTTY - oh how I've missed PuTTY. I love its ability to save connection settings, but what I've missed most of all is that, in PuTTY, simply highlighting a row of text copies it to the clipboard. None of that separately-right-clicking-to-copy-from-the-drop-down-menu or accidentally-using-the-wrong-shortcut-key-combo-and-sending-a-break-command-to-the-terminal mess. Yes, I know, PuTTY is also available for Linux, but you just don't have the same need for it on Linux, where a quick launch of the terminal puts you right into SSH.
So now that I am on Windows, while all of my servers run Linux, I find myself in the Operation Market Garden of DevOps. I feel as though I am on the wrong side of the bridge (if you don't get the reference, check out the movie A Bridge Too Far).
Maybe you find yourself in the same boat. Or maybe you haven't been forced back onto Windows, maybe that's just what your laptop came with, maybe that's what your work uses, or maybe you just genuinely like Windows.
A remote development server, as the name implies, is just a server (typically a VM) whose sole purpose is remote development. I know, that's basically a circular definition, but that's because a remote development server really is just that simple. It's just a shell of a VM that can be used to store works-in-progress and run them remotely. Best of all, most IDEs support them.