We don't launch new PVMs before the old ones go down. You may want to consider using a shutdown script that creates a new PVM when you get the notification that it's being preempted.
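For reference, a minimal sketch of such a shutdown script (attached via the shutdown-script metadata key) could look like the following; the group name, zone, and resize-by-one logic are placeholders, not a tested recipe:

#!/bin/bash
# Sketch only: on preemption, ask the managed instance group for one extra instance.
ZONE=us-central1-a        # placeholder
GROUP=my-pvm-group        # placeholder
PREEMPTED=$(curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/preempted")
if [ "$PREEMPTED" = "TRUE" ]; then
  CURRENT=$(gcloud compute instance-groups managed describe "$GROUP" \
    --zone "$ZONE" --format='value(targetSize)')
  gcloud compute instance-groups managed resize "$GROUP" \
    --zone "$ZONE" --size "$((CURRENT + 1))"
fi

Keep in mind the script only has roughly 30 seconds once the preemption notice arrives, and the VM's service account needs permission to resize the group.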
On Jul 1, 2017 7:54 PM, "Liran Gabay" <gabay...@gmail.com> wrote:
Hi Carlos,
Do you maybe know whether GCE Compute Engine launches a new preemptible VM before the current one goes down (at the target size specified by the autoscaler), or whether the new preemptible VM is launched only after the current one goes down?
Thanks,
Liran
On Thursday, June 29, 2017 at 10:15:22 PM UTC+3, Carlos (Cloud Platform Support) wrote:
Hi Liran,
Due to the nature of preemptible VMs, your application must be fault-tolerant. They can be terminated at any time, even before the 24-hour period passes, so I don't think keeping track of when they were created will help. If you don't want to, or can't, use regular VMs, you could run a combination of managed instance groups with preemptible and non-preemptible VMs, as a similar approach suggested here.
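A rough sketch of that mixed setup with the gcloud CLI might look like this (all names, sizes, and machine types are example values only, and the backend service is assumed to already exist):

# Two instance templates: one preemptible, one regular.
gcloud compute instance-templates create web-preemptible \
    --machine-type=n1-standard-1 --preemptible
gcloud compute instance-templates create web-regular \
    --machine-type=n1-standard-1

# One managed instance group per template.
gcloud compute instance-groups managed create web-pvm-group \
    --zone=us-central1-a --template=web-preemptible --size=4
gcloud compute instance-groups managed create web-std-group \
    --zone=us-central1-a --template=web-regular --size=2

# Add both groups as backends of the same (pre-existing) backend service behind the LB,
# so the regular group keeps serving if the preemptible one loses instances.
gcloud compute backend-services add-backend web-backend \
    --instance-group=web-pvm-group --instance-group-zone=us-central1-a --global
gcloud compute backend-services add-backend web-backend \
    --instance-group=web-std-group --instance-group-zone=us-central1-a --global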
I have a script that works great. I use it in production. It just needs the gcloud SDK and Linux.
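(Not the actual script, but the core of such a restart loop is just a status check plus a start, roughly like this, with instance name and zone as placeholders:)

#!/bin/bash
# Rough sketch: start the preemptible VM again if it has been preempted (status TERMINATED).
INSTANCE=my-preemptible-vm   # placeholder
ZONE=us-central1-a           # placeholder
STATUS=$(gcloud compute instances describe "$INSTANCE" \
  --zone "$ZONE" --format='value(status)')
if [ "$STATUS" = "TERMINATED" ]; then
  gcloud compute instances start "$INSTANCE" --zone "$ZONE"
fi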
On Mar 15, 2018 8:01 AM, "Grant Carthew" <gr...@carthew.net> wrote:
> I have found that for some reason Google preempts it right after you start it up...
Quote from the Google documentation: Generally, Compute Engine avoids preempting too many instances from a single customer and will preempt instances that were launched most recently. This might be a bit frustrating at first, but in the long run, this strategy helps minimize lost work across your cluster. Compute Engine doesn't bill for instances preempted in the first 10 minutes, so you also save on costs.
From this link: https://cloud.google.com/compute/docs/instances/preemptible
On Tuesday, July 11, 2017 at 1:40:57 AM UTC+10, Team Life wrote:
I ran Moodle this way for several months. You have to set up a Linux cron job with the Cloud Console command to restart the instance every 5 minutes or so from another instance somewhere. There isn't much traffic on my Moodle, but I wouldn't suggest doing this for a live server; it's only meant for testing or very little traffic. I have found that for some reason Google preempts it right after you start it up, at least that happened frequently to me. But with CentOS 7 plus a LAMP stack I never got any corruption. You can also set up a clean shutdown script to run so you get a clean shutdown, but I never even did that and never got any corruption with this setup.
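(For illustration, the cron side of that setup is just a line like the following on the watcher instance, invoking a check-and-start script such as the sketch earlier in this thread; the path and schedule are placeholders:)

# crontab on the watcher instance: every 5 minutes, start the VM again if it was preempted.
*/5 * * * * /usr/local/bin/restart-pvm-if-terminated.sh >> /var/log/pvm-restart.log 2>&1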
On Thursday, June 29, 2017 at 6:53:12 AM UTC-6, Liran Gabay wrote:
Hi,
I created preemptible instances in a managed instance group behind a load balancer, with the preemptible option specified in the instance template. Because GCP Compute Engine always terminates preemptible instances after they run for 24 hours, I need a way to launch new preemptible VMs before the current ones go down. Does GCP Compute Engine launch new preemptible VMs before the current ones go down, at the target size specified by the autoscaler? Maybe I need to replace the preemptible VMs several hours before they go down; I am not sure what the best way to do that is, maybe with termination notices. Basically, what I need is to always keep preemptible VMs running behind the LB. What is the best way to implement this?
Thanks
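(For reference, the termination notice mentioned above can be watched from inside the VM via the metadata server; a rough sketch, with the actual handling left as a placeholder:)

#!/bin/bash
# Sketch: block until the "preempted" metadata value changes, then react.
curl -s -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/preempted?wait_for_change=true"
# When this returns TRUE, the instance has about 30 seconds before shutdown;
# a real handler would drain connections and/or trigger a replacement VM here.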