Chromium PPC64 port

dftx...@free.fr

unread,
Aug 9, 2018, 5:37:17 PM8/9/18
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
Hello,

Along with the #talos-workstation community on Freenode IRC, I've been working on porting Chromium to PPC. We have a development environment on ppc64le, and the build system is working.

Raptor Computing Systems sells desktop POWER9 systems under the name Talos II (I own one). We currently lack a browser with a JS JIT, and Chromium, along with WebKit, seems like the shortest path to that goal, which is why we're putting effort into this port.

We'll be keeping track of progress on this page: https://wiki.raptorcs.com/wiki/Porting/Chromium

For now it is just a rough draft of the instructions for getting the dev environment working so we can actually start porting code; the goal is then to get changes merged upstream. The page will probably grow into a more structured resource tracking what is left to do.
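
For reference, the wiki instructions build on the standard depot_tools checkout flow; roughly (a sketch, with the ppc64-specific steps left to the wiki):
$ git clone https://chromium.googlesource.com/chromium/tools/depot_tools.git
$ export PATH="$PWD/depot_tools:$PATH"
$ fetch chromium          # creates src/ and runs an initial sync
$ cd src && gclient sync  # pulls third_party dependencies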

Don't hesitate to join #talos-workstation on Freenode to ask questions and perhaps contribute.

dftx...@free.fr

unread,
Aug 15, 2018, 10:17:44 PM8/15/18
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
We've finished porting sandbox and webrtc, and we're currently finishing breakpad / crashpad; after that we can start on Chromium's core code (if necessary).
We will probably have a substantial amount of porting work for big-endian PowerPC here. Little-endian PowerPC is what we develop on, so things will work there first.

Don't hesitate to comment here to show your interest.

Dan Horák

unread,
Sep 5, 2018, 2:40:32 PM9/5/18
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
Hi, I'm definitely interested. Qt WebEngine is based on Chromium and is becoming mandatory for a growing number of projects, some of them switching over from QtWebKit.

Shawn Anastasio

unread,
Sep 22, 2018, 3:12:47 AM9/22/18
to Chromium-dev, tpea...@raptorengineering.com, j...@jaesharp.com
A quick status update.

Major progress has been made: the browser now compiles, runs, and performs well in multiple online JavaScript benchmarks.

The biggest issue as of now is major motion-related artifacting when rendering VP9 videos, as demonstrated in this screenshot:

It is my understanding that Chromium's VP9 decoding is handled through libvpx, which already boasts ppc64le support and passes all unit tests when compiled out-of-tree.
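
For reference, the out-of-tree check looks roughly like this (a sketch; exact configure flags per libvpx's ./configure --help, and the test run fetches test data controlled by LIBVPX_TEST_DATA_PATH):
$ git clone https://chromium.googlesource.com/webm/libvpx && cd libvpx
$ ./configure --enable-unit-tests   # should auto-detect ppc64le
$ make -j$(nproc)
$ make test                         # runs the libvpx unit-test suite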

If anybody is able to offer any insight on the issue, it would be greatly appreciated.

-- Shawn

Shawn Anastasio

unread,
Sep 23, 2018, 7:54:54 PM9/23/18
to Chromium-dev, tpea...@raptorengineering.com, j...@jaesharp.com
Upon further investigation by tpearson, it seems that all relevant media unit tests pass, which makes this issue even more perplexing.

Specifically, the tests produced by the ninja target "media_unittests" all pass, with the exception of one that also fails on ARM: https://chromium.googlesource.com/chromium/src/+/b355adc86e9d4635a3f1a23567035f96bd39d428.

This, at the very least, seems to indicate that Chromium's unit tests for VP9 playback are incomplete or faulty.
Some input from those involved with the Chromium media bits would be greatly appreciated.
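
For anyone wanting to reproduce, the target can be built and run roughly as follows (the gtest filter shown is only an example):
$ autoninja -C out/Default media_unittests
$ ./out/Default/media_unittests --gtest_filter='*Vp9*'   # or omit the filter for the full suite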

--Shawn

PhistucK

unread,
Sep 24, 2018, 2:28:31 AM9/24/18
to shawnan...@gmail.com, Chromium-dev, tpea...@raptorengineering.com, j...@jaesharp.com
For media related discussions, I suggest https://groups.google.com/a/chromium.org/d/forum/media-dev.

PhistucK


--
--
Chromium Developers mailing list: chromi...@chromium.org
View archives, change email options, or unsubscribe:
http://groups.google.com/a/chromium.org/group/chromium-dev
---
You received this message because you are subscribed to the Google Groups "Chromium-dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email to chromium-dev...@chromium.org.
To view this discussion on the web visit https://groups.google.com/a/chromium.org/d/msgid/chromium-dev/e6edd18d-7ad3-4023-b3c8-8e7a39abc0b8%40chromium.org.

Timothy Pearson

unread,
Sep 24, 2018, 3:18:18 PM9/24/18
to Shawn Anastasio, Chromium-dev, j...@jaesharp.com
After looking into the flaw a bit, I'm unclear as to whether libvpx,
ffmpeg (libavcodec), or something else is handling the decode. libvpx
seems to function correctly when built out of tree to run the unit
tests, so I am not sure why it would fail inside Chrome unless another
library is in play.

Which decoder library or libraries are actually used to display VP9
content in the latest Chrome source tree?

Thanks!
--
Timothy Pearson
Raptor Engineering
+1 (415) 727-8645 (direct line)
+1 (512) 690-0200 (switchboard)
https://www.raptorengineering.com

Timothy Pearson

unread,
Sep 24, 2018, 3:18:34 PM9/24/18
to Shawn Anastasio, Chromium-dev, j...@jaesharp.com
Here's a screenshot of the browser working on ppc64. Note that the VP9
issue is the only remaining blocker; once that's fixed we should be good
to start upstreaming and providing built binaries via the various Linux distros.
[attachment: youtube_example_ppc64.png]

Timothy Pearson

unread,
Sep 24, 2018, 8:27:50 PM9/24/18
to Shawn Anastasio, Chromium-dev, j...@jaesharp.com
Since my messages were posted to the list out of order, I figured I
should follow up with the current status.

Thanks to the hints at
https://groups.google.com/a/chromium.org/forum/#!topic/media-dev/sh3dVaFPAqY
we have determined that it is a libvpx issue related to VSX / AltiVec
acceleration. We have since rebuilt the browser with libvpx using
generic code, and the result appears to work properly. We will start
submitting code for upstreaming shortly and will post links here to the
patches on the code review system.
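
For anyone curious, the difference amounts to how libvpx is configured; roughly (target names come from libvpx's configure, check --help):
$ ./configure --target=ppc64le-linux-gcc   # VSX/AltiVec-accelerated build (exhibited the artifacts)
$ ./configure --target=generic-gnu         # plain C build (works correctly)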

Timothy Pearson

unread,
Sep 25, 2018, 8:21:18 AM9/25/18
to Shawn Anastasio, Chromium-dev, j...@jaesharp.com
The first of these patches is now up for review here:

https://chromium-review.googlesource.com/c/crashpad/crashpad/+/1242633

Sandip Giri

unread,
Nov 13, 2018, 2:25:32 PM11/13/18
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
Hi,

I'm trying to validate Chrome on POWER8 using the steps provided here: https://wiki.raptorcs.com/wiki/Porting/Chromium

However, 7 patches are failing (possibly due to recent changes in the Chromium source). Please find the attached file for details.

Has anyone updated these patches to work with the latest Chromium source, or were they created for a specific Chromium branch or commit?

Thanks,

Sandip


[attachment: patches-failure-log.txt]

Shawn Anastasio

unread,
Nov 13, 2018, 3:24:24 PM11/13/18
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
Hi Sandip,

I've updated the wiki with patches that apply against the latest Chromium git. Note that the crashpad patch failure is harmless, and a few of the patches are no longer needed, including one of the pdfium patches and the gcc-only patches.
That leaves the following patches which needed updating:
0001-DONTMERGE-Disable-v8-unit-tests.patch
0001-pdfium-allocator-Use-64k-page-sizes-on-ppc64.patch
0001-Properly-detect-little-endian-PPC64-systems.patch
0001-services-service_manager-sandbox-linux-Fix-TCGETS-de.patch

Please try again with the latest versions of the above patches from the wiki.
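
In case it helps, a typical way to apply them looks roughly like this (a sketch; adjust the patch directory, and note that some patches target sub-repositories such as third_party/pdfium rather than src/ itself):
$ cd ~/chromium/src
$ for p in ~/ppc64-patches/*.patch; do git apply --check "$p" && git apply "$p"; done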

Thanks,
Shawn

Sandip Giri

unread,
Nov 16, 2018, 7:59:50 AM11/16/18
to Chromium-dev
Thanks Shawn! All the patches now apply successfully, and Chrome built without any issues.
$ ./out/Default/chrome --version
Chromium 72.0.3611.0

1) I have validated chrome-headless on an Ubuntu PPC VM; it looks like it's working fine.
To validate chrome-headless, I used the steps described here: https://developers.google.com/web/updates/2017/04/headless-chrome ; see below:

Starting Headless (CLI)
The easiest way to get started with headless mode is to open the Chrome binary from the command line. If you've got Chrome 59+ installed, start Chrome with the --headless flag:
$ chrome --version
Chromium 72.0.3611.0
$ chrome --headless --disable-gpu --remote-debugging-port=9222 https://www.chromestatus.com --no-sandbox
DevTools listening on ws://127.0.0.1:9222/devtools/browser/8d226099-b69a-4b78-b4f4-7ff2d1a81f01

Command line features
In some cases, you may not need to programmatically script Headless Chrome. There are some useful command line flags to perform common tasks.
**Printing the DOM**
The --dump-dom flag prints document.body.innerHTML to stdout:
$ chrome --headless --disable-gpu --dump-dom https://www.chromestatus.com/
**Create a PDF**
The --print-to-pdf flag creates a PDF of the page:
$ chrome --headless --disable-gpu --print-to-pdf https://www.chromestatus.com/ --no-sandbox
[1116/065342.750493:INFO:headless_shell.cc(534)] Written to file output.pdf
**Taking screenshots**
To capture a screenshot of a page, use the --screenshot flag:
$ chrome --headless --disable-gpu --screenshot https://www.chromestatus.com/ --no-sandbox
[1116/065440.525950:INFO:headless_shell.cc(534)] Written to file screenshot.png.
# Size of a standard letterhead.
$ chrome --headless --disable-gpu --screenshot --window-size=1280,1696 https://www.chromestatus.com/ --no-sandbox
[1116/065533.417761:INFO:headless_shell.cc(534)] Written to file screenshot.png.

A few more commands are available for testing the chrome-headless browser.


2) Now I have started validating chrome (non-headless).
Validating chrome requires a VNC setup, so I installed and configured VNC on the Ubuntu VM.

I tried to launch the Chrome browser under VNC; however, it currently fails with the error below:
$ chrome --version
Chromium 72.0.3611.0
$ chrome www.google.com --no-sandbox
(process:1953): Gtk-WARNING **: 12:01:17.442: Locale not supported by C library.
Using the fallback 'C' locale.

(chrome:1953): Gtk-WARNING **: 12:01:17.443: cannot open display: localhost:5903

Looking into this.
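
(A likely cause is a DISPLAY mismatch: "localhost:5903" is the VNC TCP port, not an X display name. A sketch of the usual fix, assuming the VNC server runs on display :3:)
$ export DISPLAY=:3   # must match the display the vncserver was started on
$ chrome www.google.com --no-sandbox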

3) Also, I have started validating chrome and chrome-headless with one of the node modules, which needs chrome/chrome-headless to execute its tests.
git clone https://github.com/recharts/recharts.git
cd recharts
npm install
npm test # Test execution needs CHROME browser

Initially I was getting the error below:
15 11 2018 12:44:09.505:INFO [karma-server]: Karma v3.1.1 server started at http://0.0.0.0:9876/
15 11 2018 12:44:09.505:INFO [launcher]: Launching browsers Chrome with concurrency unlimited
15 11 2018 12:44:09.633:INFO [launcher]: Starting browser Chrome
15 11 2018 12:44:09.635:ERROR [launcher]: No binary for Chrome browser on your platform.
Please, set "CHROME_BIN" env variable.

The issue was resolved after setting CHROME_BIN="/path/to/chrome/binary".
However, I'm currently getting the issue below:
15 11 2018 12:46:22.421:INFO [karma-server]: Karma v3.1.1 server started at http://0.0.0.0:9876/
15 11 2018 12:46:22.421:INFO [launcher]: Launching browsers Chrome with concurrency unlimited
15 11 2018 12:46:22.425:INFO [launcher]: Starting browser Chrome
15 11 2018 12:46:22.459:ERROR [karma-server]: { Error: spawn EACCES
at _errnoException (util.js:1022:11)
at ChildProcess.spawn (internal/child_process.js:323:11)
at exports.spawn (child_process.js:502:9)
at spawnWithoutOutput (/home/build-user/browser-test/recharts/node_modules/karma/lib/launchers/process.js:166:26)
at Object.ProcessLauncher._execCommand (/home/build-user/browser-test/recharts/node_modules/karma/lib/launchers/process.js:75:21)
at Object.ProcessLauncher._start (/home/build-user/browser-test/recharts/node_modules/karma/lib/launchers/process.js:33:10)
at Object.<anonymous> (/home/build-user/browser-test/recharts/node_modules/karma/lib/launchers/process.js:19:10)
at emitOne (events.js:121:20)
at Object.emit (events.js:211:7)
at Object.BaseLauncher.start (/home/build-user/browser-test/recharts/node_modules/karma/lib/launchers/base.js:43:10)
at Object.jobs.add [as j] (/home/build-user/browser-test/recharts/node_modules/karma/lib/launcher.js:85:17)
at Object.setTimeout.bind.j (/home/build-user/browser-test/recharts/node_modules/qjobs/qjobs.js:143:18)
at ontimeout (timers.js:482:11)
at tryOnTimeout (timers.js:317:5)
at Timer.listOnTimeout (timers.js:277:5) code: 'EACCES', errno: 'EACCES', syscall: 'spawn' }
npm ERR! Test failed. See above for more details.

Looking into this.
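
(spawn EACCES usually means CHROME_BIN does not point at an executable file; a sketch of the usual checks, paths hypothetical:)
$ export CHROME_BIN=/path/to/chromium/src/out/Default/chrome   # the binary itself, not a directory
$ test -x "$CHROME_BIN" || chmod +x "$CHROME_BIN"              # restore the exec bit if it was lost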


Thanks,
Sandip

Vaibhav Sood

unread,
Nov 20, 2018, 4:24:19 PM11/20/18
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
Hi,

I am trying to apply the breakpad patch https://wiki.raptorcs.com/w/images/f/f7/0001-Implement-ppc64-support-on-linux.patch to standalone breakpad sources (cloned as per https://chromium.googlesource.com/breakpad/breakpad/).

The patch applies correctly. Since breakpad depends on the LSS header file, I also applied this patch to the LSS header: https://wiki.raptorcs.com/wiki/File:0001-WIP-ppc64-support.patch

This, however, leads to build errors, even though, as per this thread, all patches (including breakpad) apply, build, and run correctly when applied to the Chrome source tree.
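
For reference, the standalone sequence being followed is roughly this (per breakpad's README; LSS must be checked out into src/third_party/lss):
$ git clone https://chromium.googlesource.com/breakpad/breakpad && cd breakpad
$ git clone https://chromium.googlesource.com/linux-syscall-support src/third_party/lss
$ # apply the two ppc64 patches here
$ ./configure && make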

My queries:

1) Is breakpad ported completely to ppc64le?
2) If yes, is there a difference in getting breakpad to run correctly standalone versus as part of the Chrome source?

My observations: with the Chrome sources built with all the Power patches from this thread applied, ~/chromium/src/out/Default/obj/third_party/breakpad exists.
However, standalone breakpad produces binaries such as processor/minidump_stackwalk and processor/minidump_dump, which I could not find under the built Chrome source tree.



Sandip Giri

unread,
Nov 22, 2018, 8:20:36 AM11/22/18
to Chromium-dev

Hi,

Chrome and ChromeHeadless validation summary on ppc64le:

1. Chrome built successfully using the steps provided here: https://wiki.raptorcs.com/wiki/Porting/Chromium
2. Launched both browsers successfully.
   Chrome command:
   $ chrome --no-sandbox
   ChromeHeadless command:
   $ chrome --headless --disable-gpu --remote-debugging-port=9222 https://www.chromestatus.com --no-sandbox
3. Also successfully validated both browsers with one of the node modules that needs these browsers to execute its tests.

Thanks,

Sandip

Vaibhav Sood

unread,
Jan 22, 2019, 4:01:08 PM1/22/19
to Chromium-dev, tpea...@raptorengineering.com, shawnan...@yahoo.com, j...@jaesharp.com
Hi,


However, I don't see CFI or stack-walking methods implemented in stackwalker_ppc64.cc similar to those for Intel or ARM.

Is this functionality missing, and if so, will it be added to the patch?

I have made an attempt to add it here: https://chromium-review.googlesource.com/c/breakpad/breakpad/+/1426283 ; however, the CFI part is not working correctly. I would appreciate any help getting it to work if this patch can be useful.
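
(For context, the traces below were presumably produced with breakpad's minidump_stackwalk; a typical invocation, with file names hypothetical:)
$ ./src/processor/minidump_stackwalk crash.dmp ./symbols > trace.txt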


(The Intel trace follows the PPC trace below; note that the ppc64 trace has no frames found by 'call frame info' or 'stack scanning'.)
CPU: ppc64
     16 CPUs

GPU: UNKNOWN

Crash reason:  DUMP_REQUESTED
Crash address: 0x1035f63c
Process uptime: not available

Thread 16 (crashed)
 0  blusparkd!google_breakpad::ExceptionHandler::WriteMinidump() + 0x7c
   srr0 = 0x1035f63c    r1 = 0x7d63bbd0
    Found by: given as instruction pointer in context

Thread 0
 0  libc-2.17.so!__poll_nocancel + 0x84
   srr0 = 0x88097ad8    r1 = 0xe64251f0
    Found by: given as instruction pointer in context
 1  blusparkd!main [blusparkd.C : 625 + 0x4]
   srr0 = 0x100f0698    r1 = 0xe6425250
    Found by: previous frame's frame pointer
 2  libc-2.17.so!generic_start_main.isra.0 + 0x138
   srr0 = 0x87fa5100    r1 = 0xe6425d40
    Found by: previous frame's frame pointer
 3  libc-2.17.so!__libc_start_main + 0xbc
   srr0 = 0x87fa52f4    r1 = 0xe6426020
    Found by: previous frame's frame pointer

Thread 1
 0  libpthread-2.17.so!__pthread_cond_wait + 0x16c
   srr0 = 0x8817e72c    r1 = 0x86f7e430
    Found by: given as instruction pointer in context

Thread 2
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x8676dd80
    Found by: given as instruction pointer in context

Thread 3
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x85f5dd80
    Found by: given as instruction pointer in context

Thread 4
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x8574dd80
    Found by: given as instruction pointer in context

Thread 5
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x84f3dd80
    Found by: given as instruction pointer in context

Thread 6
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x8472dd80
    Found by: given as instruction pointer in context

Thread 7
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x83f1dd80
    Found by: given as instruction pointer in context

Thread 8
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x8370dd80
    Found by: given as instruction pointer in context

Thread 9
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x82efdd80
    Found by: given as instruction pointer in context

Thread 10
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x826edd80
    Found by: given as instruction pointer in context

Thread 11
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x81eddd80
    Found by: given as instruction pointer in context

Thread 12
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x816cdf40
    Found by: given as instruction pointer in context

Thread 13
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x80ebdd80
    Found by: given as instruction pointer in context

Thread 14
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x806add80
    Found by: given as instruction pointer in context

Thread 15
 0  libc-2.17.so!__nanosleep_nocancel + 0x74
   srr0 = 0x8805ef48    r1 = 0x7fe9dd80
    Found by: given as instruction pointer in context

CPU: amd64
     family 6 model 6 stepping 3
     8 CPUs

GPU: UNKNOWN

Crash reason:  DUMP_REQUESTED
Crash address: 0x7297d3
Process uptime: not available

Thread 13 (crashed)
 0  blusparkd!google_breakpad::ExceptionHandler::WriteMinidump() [exception_handler.cc : 670 + 0xd]
    rax = 0x00007fb9c68bf6a0   rdx = 0x0000000000000000
    rcx = 0x00000000007297b2   rbx = 0x0000000000000000
    rsi = 0x0000000000000001   rdi = 0x00007fb9c68bf418
    rbp = 0x00007fb9c68bfe80   rsp = 0x00007fb9c68bf380
     r8 = 0x0000000000000000    r9 = 0x63303261662d3662
    r10 = 0x00007fb9d3681490   r11 = 0xfffffffffffffff8
    r12 = 0x0000000000000000   r13 = 0x0000000000801000
    r14 = 0x0000000000000004   r15 = 0x00007fb9c68e3700
    rip = 0x00000000007297d3
    Found by: given as instruction pointer in context
 1  libbluspark.so!ibm_htap::services::CrashReporter::reportCrash(int) [new_allocator.h : 81 + 0x4c6]
    rbx = 0x0000000000000000   rbp = 0x00007fb9c68bfe80
    rsp = 0x00007fb9c68bf9d0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9c68e3700   rip = 0x00007fb9d8721346
    Found by: call frame info
 2  libbluspark.so!ibm_htap::services::AssertTrap(std::string const&) [new_allocator.h : 81 + 0x1b]
    rbx = 0x0000000000000000   rbp = 0x00007fb9c68bfee0
    rsp = 0x00007fb9c68bfe90   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9c68e3700   rip = 0x00007fb9d86b5afb
    Found by: call frame info
 3  libbluspark.so!ibm_htap::engine::DataBase::createTableInline(std::string const&) [format.h : 974 + 0x184e]
    rbx = 0x0000000000000000   rbp = 0x00007fb9c68c0ee0
    rsp = 0x00007fb9c68bfef0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9c68e3700   rip = 0x00007fb9d87b19ce
    Found by: call frame info
 4  libbluspark.so!ibm_htap::engine::STQ::formAndAdd(ibm_htap::engine::OperatorT, std::string const&, std::string const&, ibm_htap::engine::AttrPosVec const&, std::vector<std::string, std::allocator<std::string> >&, int, ibm_htap::types::Array<ibm_htap::types::Type> const&) [stl_tree.h : 679 + 0x10a6d]
    rbx = 0x0000000000000000   rbp = 0x00007fb9c68de3e0
    rsp = 0x00007fb9c68c0ef0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9c68e3700   rip = 0x00007fb9d6f4cb2d
    Found by: call frame info

Thread 0
 0  libc-2.17.so!__poll_nocancel + 0x24
    rax = 0xfffffffffffffdfc   rdx = 0x0000000000001388
    rcx = 0xffffffffffffffff   rbx = 0x0000000000000000
    rsi = 0x0000000000000001   rdi = 0x00007fff0d98c858
    rbp = 0x00007fff0d98c8f0   rsp = 0x00007fff0d988310
     r8 = 0x0000000000000000    r9 = 0x00007fff0d9881a0
    r10 = 0x0000000000000000   r11 = 0x0000000000000293
    r12 = 0x0000000000528f21   r13 = 0x00007fff0d98c9d0
    r14 = 0x0000000000000000   r15 = 0x0000000000000000
    rip = 0x00007fb9d297fe9d
    Found by: given as instruction pointer in context
 1  blusparkd!main + 0x669d
    rbx = 0x0000000000000000   rbp = 0x00007fff0d98c8f0
    rsp = 0x00007fff0d988320   r12 = 0x0000000000528f21
    r13 = 0x00007fff0d98c9d0   r14 = 0x0000000000000000
    r15 = 0x0000000000000000   rip = 0x0000000000531fed
    Found by: call frame info
 2  libc-2.17.so!__libc_start_main + 0xf5
    rbx = 0x0000000000000000   rbp = 0x0000000000000000
    rsp = 0x00007fff0d98c900   r12 = 0x0000000000528f21
    r13 = 0x00007fff0d98c9d0   r14 = 0x0000000000000000
    r15 = 0x0000000000000000   rip = 0x00007fb9d28ae3d5
    Found by: call frame info
 3  blusparkd!_start + 0x29
    rbx = 0x0000000000000000   rbp = 0x0000000000000000
    rsp = 0x00007fff0d98c9c0   r12 = 0x0000000000528f21
    r13 = 0x00007fff0d98c9d0   r14 = 0x0000000000000000
    r15 = 0x0000000000000000   rip = 0x0000000000528f4a
    Found by: call frame info
 4  0x7fff0d98c9c8
    rbx = 0x0000000000000000   rbp = 0x0000000000000000
    rsp = 0x00007fff0d98c9c8   r12 = 0x0000000000528f21
    r13 = 0x00007fff0d98c9d0   r14 = 0x0000000000000000
    r15 = 0x0000000000000000   rip = 0x00007fff0d98c9c8
    Found by: call frame info

Thread 1
 0  libpthread-2.17.so!__pthread_cond_wait + 0xc5
    rax = 0xfffffffffffffe00   rdx = 0x0000000000000001
    rcx = 0xffffffffffffffff   rbx = 0x00007fb9d4904088
    rsi = 0x0000000000000080   rdi = 0x0000000002c7066c
    rbp = 0x00007fb9ce8f2c10   rsp = 0x00007fb9ce8f2bb0
     r8 = 0x0000000002c70600    r9 = 0x0000000000000000
    r10 = 0x0000000000000000   r11 = 0x0000000000000246
    r12 = 0x0000000000000001   r13 = 0x0000000002c21e30
    r14 = 0x0000000000000004   r15 = 0x00007fb9ce8f3700
    rip = 0x00007fb9d2c64945
    Found by: given as instruction pointer in context
 1  libstdc++.so.6.0.20!std::condition_variable::wait(std::unique_lock<std::mutex>&) [gthr-default.h : 864 + 0x8]
    rbx = 0x00007fb9d4904088   rbp = 0x00007fb9ce8f2c10
    rsp = 0x00007fb9ce8f2be0   r12 = 0x0000000000000001
    r13 = 0x0000000002c21e30   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d343ef7c
    Found by: call frame info
 2  libaws-cpp-sdk-core.so!void std::condition_variable::wait<LogThread(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::ostream> const&, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> > const&, bool)::{lambda()#1}>(std::unique_lock<std::mutex>&, LogThread(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::ostream> const&, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> > const&, bool)::{lambda()#1}) + 0x29
    rbx = 0x00007fb9d4904088   rbp = 0x00007fb9ce8f2c10
    rsp = 0x00007fb9ce8f2bf0   r12 = 0x0000000000000001
    r13 = 0x0000000002c21e30   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d4904689
    Found by: call frame info
 3  libaws-cpp-sdk-core.so!LogThread(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::ostream> const&, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> > const&, bool) + 0xd6
    rbx = 0x00007fb9d4904088   rbp = 0x00007fb9ce8f2cc0
    rsp = 0x00007fb9ce8f2c20   r12 = 0x0000000000000001
    r13 = 0x0000000002c21e30   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d490415e
    Found by: call frame info
 4  libaws-cpp-sdk-core.so!void std::_Bind_simple<void (*(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::basic_ofstream<char, std::char_traits<char> > >, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> >, bool))(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::ostream> const&, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> > const&, bool)>::_M_invoke<0ul, 1ul, 2ul, 3ul>(std::_Index_tuple<0ul, 1ul, 2ul, 3ul>) + 0xa6
    rbx = 0x00007fb9d4904088   rbp = 0x00007fb9ce8f2d10
    rsp = 0x00007fb9ce8f2cd0   r12 = 0x0000000000000001
    r13 = 0x0000000002c21e30   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d490a2c2
    Found by: call frame info
 5  libaws-cpp-sdk-core.so!std::_Bind_simple<void (*(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::basic_ofstream<char, std::char_traits<char> > >, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> >, bool))(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::ostream> const&, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> > const&, bool)>::operator()() + 0x1d
    rbx = 0x0000000002c21e00   rbp = 0x00007fb9ce8f2d50
    rsp = 0x00007fb9ce8f2d20   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d4909fe5
    Found by: call frame info
 6  libaws-cpp-sdk-core.so!std::thread::_Impl<std::_Bind_simple<void (*(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::basic_ofstream<char, std::char_traits<char> > >, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> >, bool))(Aws::Utils::Logging::DefaultLogSystem::LogSynchronizationData*, std::shared_ptr<std::ostream> const&, std::basic_string<char, std::char_traits<char>, Aws::Allocator<char> > const&, bool)> >::_M_run() + 0x1c
    rbx = 0x0000000002c21e00   rbp = 0x00007fb9ce8f2d70
    rsp = 0x00007fb9ce8f2d60   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d4909e7c
    Found by: call frame info
 7  libstdc++.so.6.0.20!execute_native_thread_routine [thread.cc : 84 + 0x3]
    rbx = 0x0000000002c21e00   rbp = 0x0000000000000000
    rsp = 0x00007fb9ce8f2d80   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d3442c80
    Found by: call frame info
 8  libpthread-2.17.so!start_thread + 0xc5
    rbx = 0x0000000000000000   rbp = 0x0000000000000000
    rsp = 0x00007fb9ce8f2da0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d2c60dd5
    Found by: call frame info
 9  libc-2.17.so!__clone + 0x6d
    rbx = 0x00007fb9ce8f3700   rbp = 0x0000000000000000
    rsp = 0x00007fb9ce8f2e40   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce8f3700   rip = 0x00007fb9d298ab3d
    Found by: call frame info

Thread 2
 0  libpthread-2.17.so!__nanosleep_nocancel + 0x24
    rax = 0xfffffffffffffdfc   rdx = 0x0000000000000000
    rcx = 0xffffffffffffffff   rbx = 0x0000000003792300
    rsi = 0x0000000000000000   rdi = 0x00007fb9ce0f0cb0
    rbp = 0x00007fb9ce0f0d00   rsp = 0x00007fb9ce0f0ca0
     r8 = 0x00000000000003e8    r9 = 0x000000005bbd0a81
    r10 = 0x0000000000009f58   r11 = 0x0000000000000293
    r12 = 0x0000000000000000   r13 = 0x0000000000801000
    r14 = 0x0000000000000004   r15 = 0x00007fb9ce0f2700
    rip = 0x00007fb9d2c67eed
    Found by: given as instruction pointer in context
 1  libstdc++.so.6.0.20!std::this_thread::__sleep_for(std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) [thread.cc : 174 + 0xa]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f0d00
    rsp = 0x00007fb9ce0f0cb0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d3442bd7
    Found by: call frame info
 2  blusparkd!void std::this_thread::sleep_for<long, std::ratio<1l, 1000l> >(std::chrono::duration<long, std::ratio<1l, 1000l> > const&) [stl_iterator.h : 739 + 0x54]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f0d00
    rsp = 0x00007fb9ce0f0cd0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x0000000000537e24
    Found by: call frame info
 3  libbluspark.so!ibm_htap::engine::DBBackgroundThread::runThread() [format.h : 974 + 0x1209]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f1540
    rsp = 0x00007fb9ce0f0d10   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d8b8d449
    Found by: call frame info
 4  libbluspark.so!ibm_htap::engine::BaseThread::threadRunner(ibm_htap::engine::BaseThread*) [new_allocator.h : 81 + 0x4c]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f1d00
    rsp = 0x00007fb9ce0f1550   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d89731ec
    Found by: call frame info
 5  libbluspark.so!void std::_Bind_simple<void (*(ibm_htap::engine::ConsolidatorThread*))(ibm_htap::engine::BaseThread*)>::_M_invoke<0ul>(std::_Index_tuple<0ul>) [new_allocator.h : 81 + 0x45]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f1d30
    rsp = 0x00007fb9ce0f1d10   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d8b9a0c5
    Found by: call frame info
 6  libbluspark.so!std::_Bind_simple<void (*(ibm_htap::engine::ConsolidatorThread*))(ibm_htap::engine::BaseThread*)>::operator()() [new_allocator.h : 81 + 0x15]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f1d50
    rsp = 0x00007fb9ce0f1d40   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d8b9a075
    Found by: call frame info
 7  libbluspark.so!std::thread::_Impl<std::_Bind_simple<void (*(ibm_htap::engine::ConsolidatorThread*))(ibm_htap::engine::BaseThread*)> >::_M_run() [new_allocator.h : 81 + 0x19]
    rbx = 0x0000000003792300   rbp = 0x00007fb9ce0f1d70
    rsp = 0x00007fb9ce0f1d60   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d8b99e59
    Found by: call frame info
 8  libstdc++.so.6.0.20!execute_native_thread_routine [thread.cc : 84 + 0x3]
    rbx = 0x0000000003792300   rbp = 0x0000000000000000
    rsp = 0x00007fb9ce0f1d80   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d3442c80
    Found by: call frame info
 9  libpthread-2.17.so!start_thread + 0xc5
    rbx = 0x0000000000000000   rbp = 0x0000000000000000
    rsp = 0x00007fb9ce0f1da0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d2c60dd5
    Found by: call frame info
10  libc-2.17.so!__clone + 0x6d
    rbx = 0x00007fb9ce0f2700   rbp = 0x0000000000000000
    rsp = 0x00007fb9ce0f1e40   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9ce0f2700   rip = 0x00007fb9d298ab3d
    Found by: call frame info

Thread 3
 0  libpthread-2.17.so!__nanosleep_nocancel + 0x24
    rax = 0xfffffffffffffdfc   rdx = 0x0000000000000000
    rcx = 0xffffffffffffffff   rbx = 0x00000000037930c0
    rsi = 0x0000000000000000   rdi = 0x00007fb9cd8efcb0
    rbp = 0x00007fb9cd8efd00   rsp = 0x00007fb9cd8efca0
     r8 = 0x00000000000003e8    r9 = 0x000000005bbd0a81
    r10 = 0x0000000000009f58   r11 = 0x0000000000000293
    r12 = 0x0000000000000000   r13 = 0x0000000000801000
    r14 = 0x0000000000000004   r15 = 0x00007fb9cd8f1700
    rip = 0x00007fb9d2c67eed
    Found by: given as instruction pointer in context
 1  libstdc++.so.6.0.20!std::this_thread::__sleep_for(std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) [thread.cc : 174 + 0xa]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8efd00
    rsp = 0x00007fb9cd8efcb0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d3442bd7
    Found by: call frame info
 2  blusparkd!void std::this_thread::sleep_for<long, std::ratio<1l, 1000l> >(std::chrono::duration<long, std::ratio<1l, 1000l> > const&) [stl_iterator.h : 739 + 0x54]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8efd00
    rsp = 0x00007fb9cd8efcd0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x0000000000537e24
    Found by: call frame info
 3  libbluspark.so!ibm_htap::engine::DBBackgroundThread::runThread() [format.h : 974 + 0x1209]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8f0540
    rsp = 0x00007fb9cd8efd10   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d8b8d449
    Found by: call frame info
 4  libbluspark.so!ibm_htap::engine::BaseThread::threadRunner(ibm_htap::engine::BaseThread*) [new_allocator.h : 81 + 0x4c]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8f0d00
    rsp = 0x00007fb9cd8f0550   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d89731ec
    Found by: call frame info
 5  libbluspark.so!void std::_Bind_simple<void (*(ibm_htap::engine::ConsolidatorThread*))(ibm_htap::engine::BaseThread*)>::_M_invoke<0ul>(std::_Index_tuple<0ul>) [new_allocator.h : 81 + 0x45]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8f0d30
    rsp = 0x00007fb9cd8f0d10   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d8b9a0c5
    Found by: call frame info
 6  libbluspark.so!std::_Bind_simple<void (*(ibm_htap::engine::ConsolidatorThread*))(ibm_htap::engine::BaseThread*)>::operator()() [new_allocator.h : 81 + 0x15]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8f0d50
    rsp = 0x00007fb9cd8f0d40   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d8b9a075
    Found by: call frame info
 7  libbluspark.so!std::thread::_Impl<std::_Bind_simple<void (*(ibm_htap::engine::ConsolidatorThread*))(ibm_htap::engine::BaseThread*)> >::_M_run() [new_allocator.h : 81 + 0x19]
    rbx = 0x00000000037930c0   rbp = 0x00007fb9cd8f0d70
    rsp = 0x00007fb9cd8f0d60   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d8b99e59
    Found by: call frame info
 8  libstdc++.so.6.0.20!execute_native_thread_routine [thread.cc : 84 + 0x3]
    rbx = 0x00000000037930c0   rbp = 0x0000000000000000
    rsp = 0x00007fb9cd8f0d80   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d3442c80
    Found by: call frame info
 9  libpthread-2.17.so!start_thread + 0xc5
    rbx = 0x0000000000000000   rbp = 0x0000000000000000
    rsp = 0x00007fb9cd8f0da0   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d2c60dd5
    Found by: call frame info
10  libc-2.17.so!__clone + 0x6d
    rbx = 0x00007fb9cd8f1700   rbp = 0x0000000000000000
    rsp = 0x00007fb9cd8f0e40   r12 = 0x0000000000000000
    r13 = 0x0000000000801000   r14 = 0x0000000000000004
    r15 = 0x00007fb9cd8f1700   rip = 0x00007fb9d298ab3d
    Found by: call frame info


Vaibhav Sood

unread,
Jan 23, 2019, 6:21:58 AM1/23/19
to Shawn Anastasio, Chromium-dev, j...@jaesharp.com, tpea...@raptorengineering.com
Hi Shawn,

Thanks! I'd appreciate any help on this.

Thanks,
Vaibhav

On Tue, Jan 22, 2019 at 11:24 PM Shawn Anastasio <shawnan...@yahoo.com> wrote:
Hi Vaibhav,

I did not implement stack walking or CFI in the patch. I was mainly aiming for 100% unit test completion, which evidently doesn't include tests for this functionality (or if it does, it's ifdef'd out and I missed it).

I'll try to contribute to the review process for your patch, and I'll also submit my current ppc64le enablement patches to Gerrit.

Thanks,
Shawn

Priya Seth Gawade

unread,
May 15, 2019, 10:49:42 AM5/15/19
to Chromium-dev, shawnan...@yahoo.com, j...@jaesharp.com, tpea...@raptorengineering.com, vaibha...@gmail.com
Hi Shawn,

Just for some context:

We have previously built Chromium on ppc64le (see Sandip Giri's posts from November in this thread) using the instructions here.

Thanks for those! However, we are running into a few issues now.

As I go through the instructions, I also note that some of the changes / patches have already been upstreamed to the respective repositories, and hence those patches are no longer required. So I was wondering: is there an updated set of patches available somewhere, or are all the changes already available upstream?

If not, we would like to assist with validation, community acceptance, community CI setup, etc. for ppc64le, if required, since this is an ongoing requirement for us and it would really help to have everything work smoothly.

One more question: are you aware of any attempt made so far to build Chromium on CentOS / RHEL on Power?

Thanks in advance!

Regards,
Priya

dftxbs3e dftxbs3e

unread,
May 15, 2019, 5:17:00 PM5/15/19
to Chromium-dev, shawnan...@yahoo.com, j...@jaesharp.com, tpea...@raptorengineering.com, vaibha...@gmail.com

Hello,

I currently maintain an ungoogled-chromium ppc64le fork with CI at https://gitlab.com/lle-bout/ungoogled-chromium

It's based on the patches made by Shawn at https://github.com/shawnanastasio/chromium_power, which is where they are kept most up to date.

Feel free to inspect how all of that is done there.

Building Chromium on systems other than Ubuntu is largely unsupported by Google, which is why we prefer to stick with what Google recommends. Chromium is a fast-moving code base, and you're better off staying on the same tracks as Google unless you want significant maintenance costs. I have made Docker images at https://gitlab.com/lle-bout/ungoogled-chromium-ci-docker-image for building ungoogled-chromium (it should not be too different for Chromium), so you can easily spin these up on RHEL / CentOS if required.
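
A rough way to try them (the registry path is assumed from the GitLab project name; check the project's container registry for the exact image and tag):
$ docker pull registry.gitlab.com/lle-bout/ungoogled-chromium-ci-docker-image
$ docker run -it --rm registry.gitlab.com/lle-bout/ungoogled-chromium-ci-docker-image /bin/bash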

Also, out of curiosity, what is this a requirement for? The community is very excited to see progress on the desktop side of ppc64le, and we're curious whether there's corporate involvement in this!

Thank you

Priya Seth Gawade

unread,
May 17, 2019, 1:18:59 PM5/17/19
to Chromium-dev, shawnan...@yahoo.com, j...@jaesharp.com, tpea...@raptorengineering.com, vaibha...@gmail.com
Hi dftxbs3e and Shawn,

Thanks for the quick response and pointers to the updated patches, latest instructions and the docker image.

We are trying to follow the updated build steps and are currently running into issues at the step below, which is failing:
./chromium/scripts/build_ffmpeg.py linux ppc64
We are looking into this, but any insights you might already have would be useful!
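
For reference, the full ffmpeg regeneration sequence is roughly this (per the scripts shipped in Chromium's third_party/ffmpeg; the exact steps on the wiki may differ):
$ cd ~/chromium/src/third_party/ffmpeg
$ ./chromium/scripts/build_ffmpeg.py linux ppc64
$ ./chromium/scripts/copy_config.sh
$ ./chromium/scripts/generate_gn.py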

Regarding your question on the requirement / use case: the broader mission of my team is to build up the overall open-source ecosystem for the Power (ppc64le) platform. We often run into a requirement for Chromium when working with npm packages, which tend to use Chromium (generally in headless mode) for running browser tests, and there have been a couple of times where we have run across a requirement for Electron, which is tightly coupled with Chromium.

We are very interested in seeing support for Power added to the community and as I mentioned earlier would like to contribute towards that.

On the CI setup, Travis on Power is probably what would be easiest for us to help with. The other option would be to use the shared environment at https://powerci.osuosl.org/. Basically, IBM provides OSU with Power servers for community use for development and/or CI. More information on how these can be obtained is available here.

Please let me know if you need more details on this.

Thanks!

Regards,
Priya

dftxbs3e dftxbs3e

unread,
May 17, 2019, 2:38:56 PM5/17/19
to Chromium-dev, shawnan...@yahoo.com, j...@jaesharp.com, tpea...@raptorengineering.com, vaibha...@gmail.com
Hello,

I can't help much without build logs. If you want assistance, there's an active PPC community at #talos-workstation on Freenode IRC; Shawn and I are both active there with ppc ecosystem work, and it would be much easier than email-like messaging.

As far as I understand, you are working at Oregon State University, either as a student or in some other role?

See you there on IRC.

Timothy Pearson

unread,
May 17, 2019, 4:41:19 PM5/17/19
to Priya Seth Gawade, Chromium-dev, shawnan...@yahoo.com, j...@jaesharp.com, vaibha...@gmail.com
We have also offered free long-term access to a POWER9 system via Integricloud for CI / development, if that helps. The same offer is available to Firefox and any other browser projects that people expect to see on ppc64 boxes. The main restriction on the offer is that there has to be actual movement toward upstreaming the ppc64 support; the goal is to make ppc64 a properly supported target in the upstream project, not just a downstream fork.

Shawn

unread,
May 17, 2019, 7:03:38 PM5/17/19
to Chromium-dev, shawnan...@yahoo.com, j...@jaesharp.com, tpea...@raptorengineering.com, vaibha...@gmail.com
Hi,

Build logs will definitely be required to properly diagnose any issues you're running into.
I should note, though, that as long as ffmpeg/0001-Add-support-for-ppc64.patch applied successfully, the build_ffmpeg.py script should run fine.
Perhaps your build environment is missing some dependencies?
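
(On Ubuntu, the standard dependency bootstrap is the script below; it is Ubuntu-oriented and may need tweaks on other distros or on ppc64le:)
$ cd ~/chromium/src && ./build/install-build-deps.sh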

In any case, as dftxbs3e mentioned, #talos-workstation on Freenode IRC is a great place to get hold of me and other PPC developers.

- Shawn

Kaushal Patel

unread,
Jul 10, 2019, 12:40:27 AM7/10/19
to Chromium-dev, shawnan...@yahoo.com, priy...@gmail.com
Hi,

I was trying to build Chromium on a ppc64le machine, and one patch is currently failing due to updates in the Chromium source code.
The failing patch is "crashpad/0001-Implement-support-for-PPC64-on-Linux.patch", and it fails on the "util/misc/capture_context_test.cc" file.
Please update it.
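
(A quick way to see exactly which hunks fail, with paths assumed for a standard checkout:)
$ cd ~/chromium/src/third_party/crashpad/crashpad
$ git apply --check /path/to/crashpad/0001-Implement-support-for-PPC64-on-Linux.patch
$ git apply --reject /path/to/crashpad/0001-Implement-support-for-PPC64-on-Linux.patch   # leaves .rej files for the failing hunks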

Thanks in advance!

Regards,
Kaushal

tpearson.raptor

unread,
Aug 31, 2019, 8:50:32 PM8/31/19
to Chromium-dev, shawnan...@yahoo.com, priy...@gmail.com
With POWER now an open ISA alongside RISC-V and MIPS, is there any way we can get support for ppc64le merged into mainline Chromium? Not only are there many hundreds of desktop POWER systems in active use in the wild today, but there are also new non-IBM desktop POWER products under active development.

Not having its flagship browser, Chromium, available for these systems, despite the larger desktop POWER community having done a port literally for free over a year ago, is frankly an embarrassment for a company like Google that otherwise seems to embrace open-source concepts and technologies. Furthermore, the reasons given for not accepting e.g. the sandbox code are weak; if the sandbox is that brittle, it should have been rewritten some time ago, not used as an excuse to avoid merging proper support for up-and-coming technologies like POWER.

What do we need to do to move this forward toward merge?