Running Graphics Intensive (Windows) Applications in the Cloud?


rick.dane...@gmail.com

Mar 23, 2009, 8:53:20 PM
to Cloud Computing
I am hoping to find a way to run graphics-intensive applications in the
cloud, such as 3ds Max (3D modeling software) and Adobe Premiere (video
editing). So far I have tried this on Amazon EC2 and GoGrid, on Windows
server cloud machines, but the programs will not run properly. I am
almost sure this is because these cloud platforms have weak graphics
support, as they are geared more toward web hosting.

I am wondering if anyone knows anything about this sort of thing, and
whether you know of any possible solutions. I would strongly prefer a
cloud set-up to buying the hardware myself. I have wondered whether it
is somehow possible to make use of a local graphics card over a
high-speed internet connection, but I have no idea how to even
approach this.

I would greatly appreciate any input from any angle on this; I am
hoping I am not the only person out there trying to do this sort of
thing.

Thanks

shrimpy

Mar 23, 2009, 10:11:56 PM
to cloud-c...@googlegroups.com
Hi,

Aren't all the cloud machines virtual machines?

I don't think today's virtual machines are powerful enough for complex
graphics work (just a guess).

Regards.

edison su

Mar 23, 2009, 10:22:07 PM
to cloud-c...@googlegroups.com
I think so. The hypervisor, at least open-source Xen, which is used in
Amazon EC2, is not optimized for graphics; it just emulates an old
graphics card for the virtual machine, which gives very bad performance
for graphics-intensive applications.
--
Best Regards,
Edison

rick.dane...@gmail.com

Mar 23, 2009, 10:29:16 PM
to Cloud Computing
Hi,

Yes, you are right about them being virtual machines; however, from my
understanding this should not have a bearing on a machine's ability to
run graphics-intensive applications. I would appreciate it if someone
more knowledgeable than me could give their input on this.

Thanks


Christopher Steel

Mar 23, 2009, 10:50:38 PM
to cloud-c...@googlegroups.com
The low-end emulated graphics card is the least of the issues; your
connection to the cloud will be the biggest bottleneck. Have you ever tried
to run a 3D game over a Remote Desktop Connection or from an X server? The
current cloud-computing architecture is not conducive to running high-end
graphics. It is fine for offline graphics processing, like video decoding,
but I don't see any cloud-based FPS games anytime soon.

-Chris

rick.dane...@gmail.com

Mar 23, 2009, 10:56:15 PM
to Cloud Computing
Thanks, Edison, for the response. I see; so it seems this won't be
possible for now. I think it would be possible in the future to build a
virtual machine that could run this with a better graphics card; I
guess there isn't the demand for it yet. I do hope companies like
Autodesk will look at the SaaS model for the future: considering that
the software costs $4,000, I think it's a great candidate for a
SaaS/cloud model.

larry shi

Mar 24, 2009, 12:52:04 AM
to cloud-c...@googlegroups.com
Graphics applications like 3ds Max don't work well in EC2. The reasons are:

- Neither the hardware Amazon uses for running the EC2 service nor the software (the Xen virtual client) is designed for running FPU-intensive applications such as graphics. Those enterprise servers have very poor FPU performance; they are simply optimized for database applications and web servers, which typically don't require high FPU processing power. The servers running your EC2 instance simply don't have much built-in graphics processing capability beyond a simple, primitive integrated video controller, shared by perhaps 8-16 CPU cores.

- All virtualization solutions on the market, including Xen, have very poor graphics processing performance. You could blame NVIDIA/AMD for not open-sourcing their device drivers, Microsoft for its intended monopoly on graphics APIs, or the incompetence of the OpenGL ARB working group. Either way, virtualized 3D graphics still has a long way to go. Even if someone points you to a virtual machine with graphics acceleration support, you will be disappointed by its performance: all of them support OpenGL 1.3 at most and have zero D3D support. That is just the reality.

thanks

- larry




Jeanne Morain

Mar 24, 2009, 4:11:33 AM
to cloud-c...@googlegroups.com, b...@installfree.com
The approach that Larry highlights only applies to first- and second-generation virtualization, which is dependent on the hypervisor and/or OS.

There are more innovative, out-of-the-box approaches from a new generation of virtualization technologies (like InstallFree) that provide virtualization via a shell integration with the host OS, without hypervisor or guest-OS requirements. This makes it possible to leverage the local graphics card and compute processing power without having to worry about a protocol presentation layer. The application and/or desktop is obfuscated via encryption and a separate file system. The bidirectional read/write file system is architected to support check-in/check-out without having to worry about offline versus online use, significant data-duplication issues from local inventory tools, not being able to access critical applications offline, or needing to work disconnected (natural disaster, plane, etc.). Virtual applications can be delivered via normal mechanisms, or streamed block by block into memory from the cloud or any file share.

Because I hate marketing pitches in these types of forums, I have to provide the disclaimer: yes, I do work for InstallFree and have worked for others in this space. I recommend doing an objective search and trying out different technologies that do not require a remote display protocol if you are focusing on image-intense programs like those you have listed, or if you are working with critical apps like echocardiograms, where a wrong interpretation could mean life or death, at least until the protocols advance to provide more frames per second and less degradation, as you imply below. Many in this space are making strides in that area: Red Hat, Teradici, etc.

Bob Sampson (b...@installfree.com) can help set up an initial walkthrough if you want to test your use case within a shell-integrated virtual application structure or virtual desktop structure.

Either way, good luck. Just remember, with those types of applications, one disclaimer no matter WHO you use: make sure they can be licensed for a multi-user model, or at least that each user has a valid access license. Many of the ISVs are working toward a chargeback/rental approach, but this area is still nascent.
 
Cheers,
Jeanne
 
 

 



Paulo Calcada

Mar 24, 2009, 4:50:48 AM
to cloud-c...@googlegroups.com
AMD is promoting a new render engine that would work in the cloud. I think that for now it's all about definitions and ideas, but it looks great:

http://www.cloudviews.org/2009/03/amd-render-fusion-moving-from-the-meta-information-paradigm-to-a-pixel-information-paradigm/

Paulo


Jan Klincewicz

Mar 24, 2009, 9:13:00 AM
to cloud-c...@googlegroups.com
You can partially blame the server vendors for cheaping out and saving a couple of bucks by using antiquated graphics processors, on the assumption that servers will be relegated to "back end" workloads.  I suppose it is possible now to retrofit a PCIe graphics card into most servers (servers never had AGP), though usually these slots are taken up by NICs and HBAs.

It appears Xen merely emulates a vanilla Cirrus graphics chip (and, if I'm not mistaken, relies on QEMU code for that).  I think the sheer size of 3D drivers scares off Xen developers.  I recall Citrix working on this for their XenDesktop product, but I am not sure they were doing anything at the hypervisor layer (despite having purchased XenSource).
--
Cheers,
Jan

Nik Simpson

Mar 24, 2009, 9:42:07 AM
to cloud-c...@googlegroups.com
On 3/23/2009 7:53 PM, rick.dane...@gmail.com wrote:
> I am hoping to be able to find a way to run graphics intensive
> applications in the cloud such as 3ds Max (3d modeling software) and
> Adobe Premiere (video editing). So far I have tried this on Amazon ec2
> and GoGrid on Windows server cloud machines but the programs will not
> run properly.. I am almost sure that this is because these cloud
> platforms have weak graphics support as they are more geared towards
> web hosting.
>

The problem with graphics is that it is very hardware intensive (at both
the CPU and the GPU level) and needs to move large amounts of data
between the OS and the graphics card at high speed. This is exactly the
sort of thing that virtualization isn't great at.

> I am wondering if anyone knows anything about this sort of thing and
> if you know of any possible solutions. I would strongly prefer to use
> a cloud set-up rather than buying the hardware myself. I have thought
> that maybe it is somehow possible to make use of a local graphics card
> over a high speed internet connection but I have no idea how to even
> approach this.

Offloading the graphics to a local GPU will work well for some graphics-
intensive applications, but when you have to move large amounts of data
(say, texture maps) between the OS and the graphics card, you are in
trouble. Consider: a high-speed internet connection might offer a few
Mbit/s of bandwidth with noticeable latency, whereas the PCIe bus offers
GB/s of bandwidth with very low latency.
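Nik's comparison can be put in rough numbers. The back-of-the-envelope sketch below uses illustrative 2009-era figures; the 5 Mbit/s link speed and the ~4 GB/s PCIe rate are my assumptions, not measurements:

```python
# Time to ship a 100 MB texture set to the GPU over a consumer
# internet link vs. a local PCIe bus. All figures are illustrative.

texture_bytes = 100 * 1024 * 1024        # 100 MB of texture data

internet_Bps = 5 * 1_000_000 / 8         # 5 Mbit/s link, in bytes/sec
pcie_Bps = 4 * 1024**3                   # PCIe 1.x x16, ~4 GB/s, in bytes/sec

internet_secs = texture_bytes / internet_Bps
pcie_secs = texture_bytes / pcie_Bps

print(f"Internet: {internet_secs:.0f} s")        # minutes, not milliseconds
print(f"PCIe:     {pcie_secs * 1000:.0f} ms")
print(f"Ratio:    ~{internet_secs / pcie_secs:,.0f}x slower")
```

Even before latency enters the picture, the raw bandwidth gap is thousands to one, which is why texture-heavy workloads fall over when the "bus" is the internet.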

>
> I would greatly appreciate any input from any angle on this, I am
> hoping I am not the only person out there trying to do this sort of
> thing.
>

I suspect that the long-term solution may be technologies like
single-root I/O virtualization (SR-IOV), so that the hypervisor can get
out of the way of graphics processing. But as yet I've not heard of
anyone building an SR-IOV-capable graphics card. This may change as
desktop virtualization becomes more important in the enterprise, though
today it is typically aimed at desktops that don't require
high-performance graphics.

--
Nik Simpson

Nik Simpson

Mar 24, 2009, 9:44:17 AM
to cloud-c...@googlegroups.com
On 3/24/2009 8:13 AM, Jan Klincewicz wrote:
> You can partially blame the server vendors for cheaping out and saving a
> couple of bucks by using antiquated graphics processors, assuming
> servers will be relegated to "back end" workloads. I suppose it is
> possible now to retrofit a PCIe graphic card (servers never had AGP) in
> most servers (though usually these slots are taken up with NICS and HBAs.)

Unfortunately, the graphics cards you see in high-end desktops use PCIe
x16 slots, which you don't find on servers; they are also power hogs and
physically quite large. So they present major challenges in terms of
slot availability, power, and cooling for the typical 1U/2U servers used
in cloud applications.

--
Nik Simpson

Jeanne Morain

Mar 24, 2009, 11:58:55 AM
to cloud-c...@googlegroups.com

Interesting thread -

Not all applications or systems are meant for the cloud.  That is the reality of it; no one is to blame.  We should no more expect server-based computing to accommodate all desktop use cases than we would expect desktops to accommodate all server use cases.

It boils down to design and purpose: what are the context and intent, and how is the application meant to be used?  As the thread below shows, certain apps are great from the cloud, like data processing (proven by Google Docs, Salesforce.com, etc.), while others, like Adobe's, will fall short based on processing, network, and other requirements.

Many have tried and realized over the years that desktops and servers are very different, and that one size fits all doesn't work.  Although the average shoe size is a size 8, would we want everyone to have to live with that size?  What about those who wear a 16 or a 4?  The point being: there are alternatives that allow check-in/check-out, and, as in previous threads, the cloud will evolve; it has no choice if it is going to be successful beyond web-based apps and VDI, because the use case for those has proven somewhat limited over the years, with the vast majority (90+%) still wanting or needing to go offline for various reasons, like local graphics compute power, natural disasters, etc.

Clouds need the ability to morph to offline as well as online usage, coming from the desktop in, not the server out.  At the end of the day it is all about the user experience with their apps, nothing else.



Andrew Badera

Mar 24, 2009, 12:32:21 PM
to cloud-c...@googlegroups.com
Or ... is it more that "the" cloud is simply not suited to all applications, yet?

There are plenty of reasons, outside the cloud itself, that intensive graphics processing doesn't work well: bandwidth, communication protocols to support video over the given bandwidth, and so on. In time, I'm sure the cloud could serve this purpose as well as any other.

For the time being, however, the preponderance of customer need and of technological capability and cost drives the cloud to focus on server applications. In time, though, as the cloud has shown us over the past couple of years, graphics-heavy applications will commoditize in the cloud, just like everything else to date.

Debashish Sarkar

Mar 24, 2009, 1:06:51 PM
to cloud-c...@googlegroups.com
As and when the other obstacles are out of the way, perhaps the cloud will be a great alternative for intensive computation and/or graphics processing.

As a real-life example, I have come across engineering teams at mid-sized companies wanting to find appropriate resources for CPU-intensive processing. They get very frustrated when they hear the cost of in-house servers with the required horsepower, especially since their needs are discrete: they do not need such resources 24/7/365. A cloud could be an excellent choice. Of course IP will be a concern, but that is a different topic for discussion.
 
Thanks

Paulo Calcada

Mar 24, 2009, 1:01:22 PM
to cloud-c...@googlegroups.com
Despite the fact that I have already posted these here, I think they are worth posting again:

http://www.cloudviews.org/2009/03/amd-render-fusion-moving-from-the-meta-information-paradigm-to-a-pixel-information-paradigm/
http://www.cloudviews.org/2009/03/cloud-gaming-would-it-be-possible/

These services, or "thoughts," could be viewed as mere intentions, but I think they show how cloud computing could be used to run almost everything...

Paulo



Nik Simpson

Mar 24, 2009, 1:44:39 PM
to cloud-c...@googlegroups.com
On 3/24/2009 11:32 AM, Andrew Badera wrote:
> Or ... is it more that "the" cloud simply not suited for all
> applications, yet?
>
> There are plenty of reasons, outside the cloud itself, that intensive
> graphic processing doesn't work well -- bandwidth, communication
> protocols to support video over given bandwidth, etc. etc. In time, I'm
> sure the cloud could serve this purpose just as well as any other.
>
> However, for the time being, the preponderance of customer need and
> technological capability/cost drives the cloud to focus on server
> applications. In time though, like the cloud has shown us the past
> couple years, graphic-heavy applications will commoditize in the cloud,
> just like everything else to date.
>

The trouble is that what constitutes a "graphics intensive" application
is a moving target. When I joined Intergraph in the mid-80s, 2D CAD was
graphics intensive, yet today you can buy a PC with enough horsepower
(both graphics and compute) to do extensive real-time rendering of 3D
CAD models. The same goes for image processing: my camera today produces
images that are substantially larger (in terms of memory footprint) than
the maximum physical memory of the VAX machines we were using back then.
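To put rough numbers on that comparison (both figures below are my own approximations, not from the post): a VAX-11/780 topped out at around 8 MB of physical memory, while a 2009-era 15-megapixel camera produces raw frames in the 20-plus MB range.

```python
# A single raw photo vs. the entire physical memory of a mid-80s VAX.
# Both numbers are approximate, for illustration only.

vax_memory_mb = 8                          # ~max physical memory, VAX-11/780
raw_image_mb = 15_000_000 * 14 / 8 / 1e6   # 15 MP at 14 bits/pixel, in MB

print(f"One raw frame: ~{raw_image_mb:.0f} MB")
assert raw_image_mb > vax_memory_mb        # one photo outgrows the whole VAX
```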

So I expect that as fast as we solve the bandwidth problems for today's
graphics-intensive apps, new apps will emerge with even greater demands
on CPU, GPU, memory size, bandwidth, and so on. For example, today
engineers offload things like computational fluid dynamics modeling as
batch jobs, but I would expect that in the future engineering CAD
applications will run these in real time.

For myself, I believe there will always be a class of applications that
will not fit the cloud model very well, and that the availability of
local low-cost, high-performance compute and graphics will always be
required.

--
Nik Simpson

Jan Klincewicz

Mar 24, 2009, 3:01:27 PM
to cloud-c...@googlegroups.com
I think we are all pretty much in agreement that cloud computing is not, and possibly never will be (never say never), all things to all people.  Now that I am running 20 Mb FiOS up and down, I do things I never would have conceived of with my 300-baud modem (what seems like) just a few years ago.

It seems to me bandwidth and latency are the big blockers here, but for the foreseeable future there will likely be a mix of local and remote computational power.  I don't see that as a big deal.  What we used to sell as "graphics workstations" for $10K a decade ago are out-powered by cheap laptops today (and so it continues).

Thin clients are not much less expensive (in components) than quad-core desktops with 4+ GB of RAM and 10K RPM RAID drives.

I think the driving force behind most of this is the management and maintenance of images, operating systems, application patching, security, and what have you.  If the dozen folks in the design group need dedicated hardware to get a job done, that's no big deal next to the 100+ "knowledge workers" screwing up their images daily by downloading rogue apps from the web, etc.

As you imply, Nik, computational needs always seem to grow to consume all the power allotted to them.  How much fun would this be if things were otherwise?
--
Cheers,
Jan

Andrew Badera

Mar 24, 2009, 3:05:13 PM
to cloud-c...@googlegroups.com
Well, I'll give you that, yes, traditionally graphics have been the fastest-evolving aspect of PC hardware ... but at some point we're bound to reach biological limits, where more graphics power can't possibly translate into an improved user experience. Though perhaps by that point we'll be moving into holographic imagery or some such new level.

Greg Pfister

Mar 25, 2009, 7:00:12 PM
to Cloud Computing
On Mar 24, 10:58 am, Jeanne Morain <jmor...@yahoo.com> wrote:
> Interesting thread -
>
> Not all applications or systems are meant for the Cloud.  That is the reality of it.  No one to blame.  We should no more think that server based computing can accommodate all Desktop Use cases as we could that desktops could accommodate all server use cases. 

Likely there are things that can't be done well in a cloud, but I
don't think this is one of them.

Paulo pointed to it above. Miss it? This is exactly why AMD is doing
Render Cloud (http://tinyurl.com/9teo9r). Elevator summary:

Build a cloud with standard servers *AND* attach to them big graphics
offload engines (ATI/AMD Fusion, NVIDIA CUDA (not from AMD,
obviously), Intel Larrabee, etc.).

Do all the graphics rendering and image creation on the cloud, but
don't try to display it there. It's just bits. Use what would be the
display buffer contents to create video, and stream that to the user.

HD video would be as good as locally-attached graphics on most
displays.

Mmmm, tasty bandwidth charges. Pretty major response-time
requirements, too. But it certainly can work. Not today, but someday,
if the demand is there.
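A quick sanity check on those bandwidth charges; the bitrates below are my ballpark assumptions, not figures from AMD:

```python
# Why a render cloud streams compressed video rather than raw frames.
# Bitrates are ballpark 2009-era figures, for illustration only.

width, height, bits_per_pixel, fps = 1280, 720, 24, 30

raw_mbps = width * height * bits_per_pixel * fps / 1e6   # uncompressed 720p30
h264_mbps = 5                                            # typical 720p H.264

print(f"Raw framebuffer stream: ~{raw_mbps:.0f} Mbit/s")
print(f"H.264 encoded stream:   ~{h264_mbps} Mbit/s")
print(f"Compression makes it ~{raw_mbps / h264_mbps:.0f}x cheaper to ship")
```

Hundreds of Mbit/s for raw frames vs. single-digit Mbit/s after encoding is the whole reason "use the display buffer to create video" works at all.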

Somehow I suspect the Grid guys are likely to do this first, since
they've got the visualization requirements. Maybe it'll be a toss-up
between them and the game guys (MMORPGs like World of Warcraft, etc.).
Play WoW on your cell phone! -- or with a crummy cheap graphics card
in your laptop.

Greg Pfister
http://perilsofparallel.blogspot.com/

Greg Pfister

Mar 25, 2009, 7:44:39 PM
to Cloud Computing
Well, it looks as though one company is trying this for a game:
http://groups.google.ca/group/cloud-computing/browse_frm/thread/f0144fabc6ab43da?hl=en

Latency and cost shot it down. They said it might still be OK for
Autodesk or other CAD applications.

So much for this iteration on the system they were using, anyway.

Greg Pfister
http://perilsofparallel.blogspot.com/

Greg Pfister

Mar 25, 2009, 8:09:39 PM
to Cloud Computing
Ah, but they were trying to run it on a regular cloud...

Latency will still be a problem, though, at least for twitch games.
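The twitch-game problem comes down to simple arithmetic; the round-trip figure below is an illustrative assumption, not a measurement:

```python
# Frame budget for a "twitch" game vs. a typical internet round trip.
# The RTT is an illustrative assumption.

fps = 60
frame_budget_ms = 1000 / fps     # ~16.7 ms per frame at 60 fps
internet_rtt_ms = 40             # optimistic consumer RTT to a datacenter

print(f"Frame budget at {fps} fps: {frame_budget_ms:.1f} ms")
print(f"Network round trip alone: {internet_rtt_ms} ms")

# The network alone eats more than two full frames before the cloud
# even starts rendering or encoding, so input lag is baked in.
assert internet_rtt_ms > 2 * frame_budget_ms
```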

Greg Pfister
http://perilsofparallel.blogspot.com/


Jeanne Morain

Mar 26, 2009, 8:15:44 PM
to cloud-c...@googlegroups.com

Makes sense. Not all applications will be viable for cloud-based computing until both the technology and the requirements are better defined.

Gaming does not scare me as much as medical applications, like those used to render echocardiograms or MRI scans, where in the short term a wrong result could mean life or death.




Ivo Murris

Apr 2, 2009, 4:54:27 PM
to cloud-c...@googlegroups.com
Hi, maybe I'm late, but let me give my 2 cents' worth:

Graphically intensive applications are bound by two things: the backend
compute and the graphics capabilities of the display engine on the
desktop that hosts the application. Inside the cloud, the first can be
provisioned easily by throwing more (virtual) CPUs into the fight. The
second isn't solved as easily, because the problem has three parts: the
virtual display driver, the remoting protocol used to get the interface
to the local desktop, and the graphics capabilities of the local
desktop.

Let's say you are running a remote 3D graphics application (MathCAD,
Mathematica, AutoCAD, etc.) or are playing an HD DVD on a remote desktop.
The remote application creates the detailed image or video by doing the
needed calculations and then calling the API of the underlying OS to
display it. It does this using the graphics driver and capabilities of
the virtual graphics adapter. The OS then converts this into a network
stream, which is sent over the network or the Internet to the user's
local desktop. This stream is converted back into 'real' API calls to
the local graphics driver to show the images. Using a remoting protocol
like RDP 6, this wouldn't work, as the capabilities just aren't there
for high-resolution graphics and video. That's why a lot of VDI
(Virtual Desktop Infrastructure) vendors have been using other remoting
protocols or creating extensions to RDP. With the coming of RDP 7
(Windows 7 on the remote desktop hosting the application), this will
become MUCH better, and remote video editing and highly
graphics-intensive applications should actually become possible
(http://computerboom.blogspot.com/2008/12/microsoft-demos-remote-desktop-7.html).
If you can't wait for this to arrive, companies like Citrix
(http://www.citrix.com/English/NE/news/news.asp?newsID=1686302) and
Teradici (http://www.teradici.com/teradici.php) can actually provide
this now. Now we wait for the companies to deploy it and give you a
video editing workstation on the Net, free, on demand :-).
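The pipeline Ivo describes can be sketched as a per-frame latency budget; every stage cost below is a made-up placeholder, purely to show how the stages add up:

```python
# Hypothetical end-to-end budget for one remoted frame, following the
# stages described above. All stage costs are invented for illustration;
# real remoting protocols vary widely.

stages_ms = {
    "render on remote VM (virtual adapter)": 10,
    "encode into remoting protocol": 8,
    "network transit": 30,
    "decode into local API calls": 5,
    "local display": 2,
}

total_ms = sum(stages_ms.values())
for stage, cost in stages_ms.items():
    print(f"{stage:40s} {cost:3d} ms")
print(f"{'total':40s} {total_ms:3d} ms  (~{1000 // total_ms} fps ceiling)")
```

With numbers anywhere near these, the network hop dominates, which is why the protocol work at Microsoft, Citrix, and Teradici concentrates on reducing what has to cross it.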

Kind regards,

Ivo Murris



dan cox

Apr 2, 2009, 6:14:32 PM
to cloud-c...@googlegroups.com
Ivo, you're right on. The graphical capability cannot be a cloud-based
function; it has to be a display-management function driven by a desktop
or workstation suitably equipped with the right graphics libraries and
whatever graphics-assist engine is required. The large-scale number
crunching for CFD or other simulation applications can be distributed
cloud-wise, but the presentation of the results needs to be local, for
reasons of data-stream size and pixel management.