January 8 2014 PA Topic - DevOps, Continuous Delivery, and Agile Teams


Creg Schumann

Nov 27, 2013, 4:45:21 PM
to practic...@googlegroups.com
We've had a lot of discussion with different teams, in some of our DevJamU courses, and elsewhere about DevOps and Continuous Delivery topics in relation to Agile teams.

What are your teams experiencing? What benefits is your team hoping for with regard to DevOps or CI/CD capabilities? Are there constraints you are facing that prevent CI/CD - constraints that would go away if….?

We're looking for some great lightning talkers for the January 8 session. Being a lightning talker means you take 5-7 minutes to share your experience around DevOps and CI/CD within Agile teams. Maybe your experience is a story of how putting in a CI/CD environment helped the team achieve certain goals. Maybe your story is about some tough constraints that prevented the "perfect" CI/CD environment and what you had to do given those constraints.

If you have an experience that you would like to share as a lightning speaker - contact Susan...@DevJam.com with a summary of your experience.

After our lightning talkers (we hope to have 4-5) share their experiences, the rest of the audience will submit questions they want to hear a conversation about in the Fish Bowl conversation that follows.

Look forward to some great stories and conversations!

Kyle Boon

Jan 9, 2014, 11:36:33 AM
to practic...@googlegroups.com
Last night David mentioned a presentation about development at Google. That presentation is where we took the idea of a monolithic code base from. The idea was sort of mind-bending because we were moving away from a monolithic application, and our default assumption was that this required a separate code repository for each one.


Another presentation I mentioned briefly is called "ChatOps at GitHub"; it describes using a chat bot called Hubot for both automation and information right in your group chat tool.


We're currently experimenting with this tool.
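For anyone curious what a Hubot "ChatOps" command looks like, here is a minimal sketch. Hubot loads each file in a scripts/ directory and calls its exported function with a robot object; robot.respond registers a handler for messages addressed to the bot. The "deploy" command and the stub robot below are purely illustrative, not anything GitHub ships:

```javascript
// Minimal Hubot-style script: the exported function receives the robot.
const script = (robot) => {
  // "hubot deploy <app>" in chat triggers this handler.
  robot.respond(/deploy (\w+)/i, (res) => {
    res.reply(`deploying ${res.match[1]}...`);
  });
};

// Tiny stub robot so the sketch can be exercised without a chat adapter.
const handlers = [];
const stubRobot = { respond: (re, cb) => handlers.push([re, cb]) };
script(stubRobot);

// Simulate someone typing "deploy website" at the bot.
let replied;
const [re, cb] = handlers[0];
const match = "deploy website".match(re);
cb({ match, reply: (text) => { replied = text; } });
console.log(replied); // deploying website...
```

In a real installation, the reply goes back into the chat room through whatever adapter (Campfire, IRC, etc.) the bot is running on, so the whole team sees both the command and its result.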

Kyle Boon

bill turner

Jan 10, 2014, 10:01:35 AM
to practic...@googlegroups.com
Thanks for posting this, Kyle!


--
To Post: email to practic...@googlegroups.com
To Read Posts: http://groups.google.com/group/practical-agile?hl=en
To The Website: https://sites.google.com/site/twincitiespracticalagility/
To Unsubscribe: email to practical-ag...@googlegroups.com
---
You received this message because you are subscribed to the Google Groups "Practical Agility" group.
To unsubscribe from this group and stop receiving emails from it, send an email to practical-agi...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.



--
Bill Turner
 
GR8 in the US conference - http://gr8conf.us/
Blog : Groovy/Grails Talk - http://www.changent.com
Twitter : bill_turner - http://twitter.com/bill_turner
LinkedIn : wltiii - http://www.linkedin.com/in/wltiii
Phone : +1.612.276.2135

An innovative Software Development Professional, Bill Turner is the co-founder and Principal Organizer for GR8 in the US, a conference on Groovy, Grails, Griffon and related technologies. He has worked in companies from emerging startups to Fortune 500s, and has consulted and lived internationally. The majority of his 30 plus years in the software development industry has been spent as a consultant, with 75% of the business being repeat/referral business. He has collaborated closely with individuals at every level of the organization and has performed a wide variety of roles. He knows software development. He understands business. He helps companies succeed.

Joseph Athman

Jan 10, 2014, 1:45:26 PM
to practic...@googlegroups.com
I'm also pretty skeptical about the single monolithic codebase approach. I've heard Google talk about this several times, but every time I hear about it I keep thinking it's a sacred cow of theirs. IMHO one large codebase *may* reduce code duplication and allow easier large-scale refactoring, but at the expense of increased coupling and unintentional dependencies. One large codebase puts a ton of pressure on each developer to know exactly which pieces of code are expected to depend on each other. In my experience that isn't reasonable, and it encourages a big ball of mud. Making smaller, independent codebases makes the coupling much more intentional and harder to get wrong, with the understanding that certain code may end up getting duplicated. To me duplication is less of a concern than coupling, so I favor the smaller codebases.

Joe

Josh Sheppard

Jan 10, 2014, 3:09:29 PM
to practic...@googlegroups.com
Joe,

One way Google seems to tackle the knowledge of dependencies is by documenting them in the build AND documenting the dependencies of tests (i.e. which tests should be triggered by which code changes). This has the side effect of helping them optimize builds and tests down to only what is necessary. In this way, I suspect it is still not simple to know it all, but you at least have a way of figuring it out quickly and testing whether it is broken.
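To make that idea concrete, here is a small sketch of dependency-driven test selection: invert the declared dependency graph, then walk it transitively from a changed target to find every target (tests included) that could be affected. The target names and the JavaScript representation are illustrative only; this is not Google's actual build tooling:

```javascript
// Declared build metadata: each target lists what it depends on.
const deps = {
  "//search:frontend": ["//search:index", "//common:net"],
  "//search:frontend_test": ["//search:frontend"],
  "//common:net_test": ["//common:net"],
};

// Invert the graph, then walk transitively from a changed target to
// collect every target (including tests) that could be affected.
const affectedBy = (changed) => {
  const rdeps = {};
  for (const [target, ds] of Object.entries(deps)) {
    for (const d of ds) {
      (rdeps[d] = rdeps[d] || []).push(target);
    }
  }
  const seen = new Set();
  const visit = (t) => {
    for (const r of rdeps[t] || []) {
      if (!seen.has(r)) {
        seen.add(r);
        visit(r);
      }
    }
  };
  visit(changed);
  return [...seen].sort();
};

// A change to //common:net triggers its own test plus everything downstream.
console.log(affectedBy("//common:net"));
```

The payoff is exactly the optimization Josh describes: a change to one target runs only the tests reachable from it in the reverse graph, instead of the whole suite.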

It does still depend on each dev to ensure that all of their changes that increase or change dependencies get documented in the build and test dependencies... But if doing so is part of their culture maybe that's OK too.

I think it would be hard for anyone else to build toward that without significant collective will and time. Someone mentioned "how much did that cost?" the other night, which seems an appropriate question. How much does it cost to create and maintain, and is that better than some alternative approach?

Josh

Joseph Athman

Jan 10, 2014, 5:12:02 PM
to practic...@googlegroups.com
I've watched a couple talks by Google test engineers and I think the process they have for only testing what needs to be tested is very impressive, but I have always felt like they are solving their problem the wrong way. The problem isn't "how do we efficiently test a codebase with millions of lines of code and thousands of developers", the problem is "we have a codebase with millions of lines of code and thousands of developers". 

In my experience smaller is almost always better in software development. Smaller methods, smaller classes, smaller packages, smaller libraries...seems strange to say that one large codebase is better than several small ones.

Joe

Dion Stewart

Jan 10, 2014, 8:54:08 PM
to practic...@googlegroups.com
I wasn’t at the session so apologies if I’m way off base due to missing context or a misunderstanding. I have a question…

Does Google really have a monolithic code base or do they have a bunch of small projects that are simply checked into “head” (or a “single branch” or however you want to define it)? Are those the same thing or is there a difference? 

I think they still have separate unique projects, it’s just that they don’t have a ton of branches for different versions of each project so they’re not proliferating dependency management hell.

Dion

Brandon Carlson

Jan 10, 2014, 8:56:52 PM
to practic...@googlegroups.com
Ditto on not being there. My understanding is that they have a mega *repository* but separate projects/components within that single repo. I have no firsthand experience, so I could be way off.



Josh Sheppard

Jan 10, 2014, 8:59:42 PM
to practic...@googlegroups.com
Dion,

I'm honestly not sure. This is the video I remember most that pertains to the topic.

Kyle Boon

Jan 11, 2014, 10:25:47 AM
to practic...@googlegroups.com
Our goal was actually to make sure that our developers did not need to know what depends on what. Instead, the build knows the dependencies, and developers can build the entire system with a single command and see if a downstream project is affected.
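A sketch of what "the build knows the dependencies" can mean in practice: given declared project dependencies, a change to one project marks everything downstream as dirty, and the dirty set is then rebuilt in dependency order. The project names here are made up for illustration; any real build tool (Gradle, Maven, etc.) does this bookkeeping for you:

```javascript
// Declared project dependencies: app -> service -> core.
const deps = {
  app: ["service"],
  service: ["core"],
  core: [],
};

// Given a changed project, compute the rebuild order.
const rebuildOrder = (changed) => {
  // Find every project that transitively depends on the changed one...
  const dirty = new Set([changed]);
  let grew = true;
  while (grew) {
    grew = false;
    for (const [p, ds] of Object.entries(deps)) {
      if (!dirty.has(p) && ds.some((d) => dirty.has(d))) {
        dirty.add(p);
        grew = true;
      }
    }
  }
  // ...then order the dirty set so dependencies build first.
  const order = [];
  const emit = (p) => {
    if (order.includes(p) || !dirty.has(p)) return;
    deps[p].forEach(emit);
    order.push(p);
  };
  [...dirty].forEach(emit);
  return order;
};

// A change to core forces service and then app to rebuild, in that order.
console.log(rebuildOrder("core"));
```

The single-command workflow Kyle describes is this computation run automatically: the developer changes code, and the build walks the graph to rebuild and retest only the downstream projects.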