Just a quick follow-up on this:
If you have jobs stuck in the queue, there are a couple of ways you can clear them on the backend.
First, there is a command that will terminate all running jobs:
Keep in mind that this will kill ALL jobs, so if there are any in the queue that you actually want, you'll need to restart those processes afterwards (i.e. redo your import, etc.).
See:
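For reference, a minimal sketch of what that can look like from the root of an AtoM installation. The task name and the worker service name here are assumptions based on a typical setup; check the linked documentation for the exact commands for your version:

```shell
# Terminate and clear ALL jobs in the queue (destructive; see caveat above)
php symfony jobs:clear

# Restart the job worker afterwards so new jobs are picked up
# (the service name may differ on your install)
sudo systemctl restart atom-worker
```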
We've also provided a SQL query that can be used to terminate individual jobs. See:
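As a rough illustration of the SQL approach (the `job` table matches AtoM's schema, but the exact status value to set is version-specific, so use the query from the linked documentation rather than this sketch):

```shell
# Inspect the queue to find the stuck job's id
mysql -u atom -p atom -e "SELECT id, name, status_id FROM job;"

# Then update only that job's status; replace <ID> and <STATUS>
# with the values given in the linked documentation
mysql -u atom -p atom -e "UPDATE job SET status_id=<STATUS> WHERE id=<ID>;"
```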
Radda is correct to point out that using the command line and disabling the nested set build will definitely improve the speed. On the command line, indexing as the task progresses is also disabled by default, while in the user interface each row triggers an update to both the nested set and the search index; it is generally these two processes that make additions to large hierarchies slow. So long as you remember to run the tasks to rebuild the nested set and repopulate the search index after your import, using the command line will likely vastly improve the process, as will upgrading to 2.6 where, as Radda mentioned, we've added a number of performance optimizations.
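In practice, the post-import steps look something like the following, run from the AtoM root. The CSV path is a placeholder, and import flags vary by version (you can check with `php symfony help csv:import`); the two rebuild tasks are the standard maintenance tasks:

```shell
# Run the import from the command line (nested set build and
# per-row indexing are skipped here, unlike in the user interface)
php symfony csv:import path/to/your-import.csv

# Afterwards, rebuild the nested set...
php symfony propel:build-nested-set

# ...and repopulate the search index
php symfony search:populate
```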
Cheers,