
Could 7-zip make use of GPU?


Man-wai Chang

Sep 6, 2010, 10:30:07 AM

--
@~@ Might, Courage, Vision, SINCERITY.
/ v \ Simplicity is Beauty! May the Force and Farce be with you!
/( _ )\ (x86_64 Ubuntu 9.10) Linux 2.6.35.4
^ ^ 22:29:01 up 10 days 36 min 1 user load average: 1.07 1.15 1.16
No borrowing! No scams! No compensated dating! No fighting! No robbery! No suicide! Please consider CSSA (Comprehensive Social Security Assistance):
http://www.swd.gov.hk/tc/index/site_pubsvc/page_socsecu/sub_addressesa

VanguardLH

Sep 6, 2010, 10:57:29 AM
Man-wai Chang wrote:

<yep, a blank post>
<blank body = blank post = blank mind>


When did 7-zip become a video editing or rendering program? There's a
game called 7-zip?

GPU = Graphics Processing Unit (it's on your video card)
http://en.wikipedia.org/wiki/Gpu

CPU = Central Processing Unit
http://en.wikipedia.org/wiki/Cpu

Jeremy

Sep 6, 2010, 11:20:29 AM

If they added CUDA support, yes, but as it is now I don't believe it
does.

Man-wai Chang

Sep 6, 2010, 11:52:40 AM
> If they added CUDA support, yes, but as it is now I don't believe it
> does.

CUDA is too vendor-specific... DXVA?

--
@~@ Might, Courage, Vision, SINCERITY.
/ v \ Simplicity is Beauty! May the Force and Farce be with you!
/( _ )\ (x86_64 Ubuntu 9.10) Linux 2.6.35.4

^ ^ 23:48:01 up 10 days 1:55 1 user load average: 1.20 1.13 1.15

Man-wai Chang

Sep 6, 2010, 11:53:24 AM
> When did 7-zip become a video editing or rendering program? There's a
> game called 7-zip?

Just curious whether 7-zip could make use of the core(s) of a modern GPU...

--
@~@ Might, Courage, Vision, SINCERITY.
/ v \ Simplicity is Beauty! May the Force and Farce be with you!
/( _ )\ (x86_64 Ubuntu 9.10) Linux 2.6.35.4

^ ^ 23:48:01 up 10 days 1:55 1 user load average: 1.20 1.13 1.15

poutnik

Sep 6, 2010, 2:50:05 PM
In article <i632r7$baf$2...@news.eternal-september.org>,
toylet...@gmail.com says...

>
> > When did 7-zip become a video editing or rendering program? There's a
> > game called 7-zip?
>
> Just curious whether 7-zip could make use of the core(s) of modern GPU...

I guess it possibly could, but it would be highly inefficient, so
such an implementation is very unlikely.

A GPU is a very specialized processor and is not usable for every
kind of operation. Data compression that isn't built on lossy DCT
or other multimedia math is unlikely to be the sort of work a GPU
handles well.

Think of the CPU as a toolbox and the GPU as a hammer.

The hammer/GPU is great at driving nails: video and
multimedia work.

But the hammer/GPU is a poor fit for other jobs, like screws,
that is, general-purpose computing.

Some data is very hard to preprocess into a form the GPU can
digest, and the CPU-GPU transfer overhead can cost more than the
GPU processing itself, or more than doing the whole job on the CPU.
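To make that serial dependency concrete, here is a toy LZ77-style matcher in Python. It is only an illustrative sketch, not 7-zip's actual algorithm (7-zip's LZMA is far more elaborate): each step searches the output already emitted, which is exactly the kind of chained dependency that GPU cores, built for independent parallel work items, handle poorly.

```python
def lz77_tokens(data: bytes, window: int = 32):
    """Return a list of literal byte values and (offset, length) back-references."""
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        # The search window is the stream already processed: a serial dependency.
        for j in range(max(0, i - window), i):
            k = 0
            while i + k < len(data) and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= 3:                  # back-reference only if it pays off
            out.append((best_off, best_len))
            i += best_len
        else:
            out.append(data[i])            # otherwise emit a literal byte
            i += 1
    return out

def lz77_decode(tokens) -> bytes:
    """Invert lz77_tokens: replay literals and copy back-references."""
    out = bytearray()
    for t in tokens:
        if isinstance(t, int):             # literal byte
            out.append(t)
        else:                              # (offset, length) copy, may overlap
            off, length = t
            for _ in range(length):
                out.append(out[-off])
    return bytes(out)
```

Every iteration of the outer loop depends on where the previous one stopped, so the work cannot simply be split across thousands of GPU threads without changing the format.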

--
Do you know the difference
between Windows versions and guided missiles?
None. Both are fired and forgotten.


VanguardLH

Sep 6, 2010, 5:58:01 PM
Man-wai Chang wrote:

>> When did 7-zip become a video editing or rendering program? There's a
>> game called 7-zip?
>
> Just curious whether 7-zip could make use of the core(s) of modern GPU...

Companies have been considering merging the CPU and GPU; that's why
AMD acquired ATI back on 25-Oct-2006. See:

http://tinyurl.com/2cxlv93 (Jun 2010)
http://www.xbitlabs.com/news/cpu/display/20061107081152.html (Nov 2006)

The GPU+CPU merge allows faster communication between the two
processing units, since the signal no longer has to be pumped up to
get off the die or squeezed through comparatively slow external buses.
Companies have always striven to put more on a die, but heat remains
the enemy (as does reliability, due to failed junctions and the
redundant logic needed to work around them, which generates even more
heat). I worked on a project that put 3 mainframes on a single 3" die;
it had to be submerged in liquid nitrogen before use. Then there is
the gap between what is technically doable and what consumers will
actually pay for. Closer coupling provides faster synergy but bars
component-level upgrades. Not every consumer would pay for a mobo with
a closely coupled CPU+GPU+maxRAM at a starting price somewhere over
$2500, before even buying the case, PSU, and hard disks, just so it
would have a 6+ year lifespan before newer, cheaper technology
outpaced it.

The instruction sets for each will probably remain separate and
distinct despite being executed on the same die. That is, the
instructions will get merged side by side, not absorbed into each
other. Numerical crunching is not best performed by a GPU; the NPU/FPU
has long been integrated into the CPU. GPUs have been bulking up their
number-crunching capabilities, which is why the fusion of GPU and CPU
is planned. However, the Intel instruction set is well known and
expected on consumer platforms. How do you know which GPU will happen
to be present in a current-day consumer-grade machine? And why would a
programmer bother to code for a specific GPU rather than simply issue
system API calls and let the system transparently figure out what to
do? Maybe OpenCL will take off, but it's too early to tell (remember
when Java was touted as the wonderful cross-platform language, yet
just how universal has it become?), especially with Apple fucking it
up with trademark rights, trying to keep closed a supposedly open
standard. And what advantage can you take of the strides in GPU
floating-point GFLOPS for computations that involve integer
arithmetic, which gains nothing from increased FP precision? FP math
in GPUs probably exceeds what is available in CPUs, which is another
reason to merge the two, but I'm not sure just how much FP math is
involved in encoding/decoding bytes.
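For a concrete sense of how integer-bound compression is, here is a minimal Huffman code-length builder in Python. It is a hypothetical sketch for illustration only (7-zip's LZMA actually uses range coding, not Huffman), but the point carries over: counting symbols, merging tree nodes, and assigning bit lengths never touch floating point, so GPU FP throughput buys nothing here.

```python
import heapq
from collections import Counter

def huffman_lengths(data: bytes) -> dict:
    """Map each byte value in data to its Huffman code length in bits."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate one-symbol input
        return {next(iter(freq)): 1}
    # Heap entries: (weight, tiebreaker, {symbol: depth-so-far}).
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)     # pop the two lightest subtrees
        w2, _, d2 = heapq.heappop(heap)
        # Merging pushes every symbol in both subtrees one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]
```

Everything above is counts, comparisons, and increments: pure integer work, which is the CPU's home turf.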

Have a read here:
http://www.nvidia.com/content/PDF/fermi_white_papers/N.Brookwood_NVIDIA_Solves_the_GPU_Computing_Puzzle1.pdf

It's coming. It's not here yet. When it is prevalent, it will be
transparent, so a program like 7-zip is still not going to code
specifically for a GPU since, by then, there won't be a GPU or CPU,
but instead an APU (or whatever grows out of the merger, stabilizes,
finds prominent adoption, and gets accepted by consumers).

poutnik

Sep 7, 2010, 2:52:16 AM
In article <i63o1a$gpi$1...@news.albasani.net>, V...@nguard.LH says...

>
> Man-wai Chang wrote:
>
> >> When did 7-zip become a video editing or rendering program? There's a
> >> game called 7-zip?
> >
> > Just curious whether 7-zip could make use of the core(s) of modern GPU...
>
> Companies have been considering merging the CPU and GPU; that's why
> AMD acquired ATI back on 25-Oct-2006. See:
>
> http://tinyurl.com/2cxlv93 (Jun 2010)
> http://www.xbitlabs.com/news/cpu/display/20061107081152.html (Nov 2006)
>
.............

>
> It's coming. It's not here yet. When it is prevalent, it will be
> transparent, so a program like 7-zip is still not going to code
> specifically for a GPU since, by then, there won't be a GPU or CPU,
> but instead an APU (or whatever grows out of the merger, stabilizes,
> finds prominent adoption, and gets accepted by consumers).

I would say it contradicts nothing I have said before.

It will significantly decrease the overhead for the tasks a GPU
is suited to and will give them a boost.

But for general computing it is more efficient to put more CPU
cores on the chip instead.

GPU task programming takes much effort, similar to SSEn
multimedia processing.

Programmers use SSEn instructions only if the expected speed gain
is worth the effort.

(From discussions with developers of SSEn-based C++ libraries for
Avisynth video processing.)
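The flavor of those SSEn-style gains can be sketched even in Python. This is only an analogy, not real SIMD: one wide operation replaces a per-byte loop, roughly the way SSE2's PXOR handles 16 bytes in a single instruction, and the point is that restructuring code to work this way is effort the programmer spends only when the data layout suits it.

```python
def xor_scalar(a: bytes, b: bytes) -> bytes:
    """One byte per step, like a plain scalar loop."""
    return bytes(x ^ y for x, y in zip(a, b))

def xor_wide(a: bytes, b: bytes) -> bytes:
    """Whole buffer in one operation, the SIMD way of thinking."""
    wide = int.from_bytes(a, "big") ^ int.from_bytes(b, "big")
    return wide.to_bytes(len(a), "big")
```

Both produce identical output; the wide version just trades per-element control for throughput, which is the same trade SSEn (and GPU kernels) demand.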

--
Poutnik
