Places Taxonomy Import via Command Line


prathmes...@gmail.com

unread,
Aug 29, 2013, 5:54:20 PM8/29/13
to ica-ato...@googlegroups.com
We here at Dumbarton Oaks are trying to import an XML file with many entries for the Places taxonomy. We tried importing it through the front end using the SKOS import feature, but the import gets cut short after the 1,435th term (of 1,983 terms). Further investigation in debug mode shows that this is because of the PHP execution limits. I made the necessary changes in the php.ini file and tried removing any restrictions on the memory limit. Still no luck, so, as suggested, I decided to attempt the import via the CLI.
I went through $ php symfony list to find the right command for this import. Can anyone tell me which symfony command imports this XML file into the Places taxonomy?

FYI,
I tried php symfony import:bulk --v /tmp/skos_sites_final_fixed-1.xml
Importing 1 files from /tmp/skos_sites_final_fixed-1.xml (indexing is ENABLED) ...
Successfully imported 1 XML/CSV files in 475.58 s. 360241800 bytes used.


But this doesn't import any places, and the Places taxonomy is empty when checked.

Thanks !

-Prathmesh

Dan Gillean

unread,
Aug 29, 2013, 8:24:30 PM8/29/13
to ica-ato...@googlegroups.com
Hi Prathmesh,

In trying to find an answer to your question, I came across an orphaned issue ticket for a bug we identified some time ago but had not yet addressed: https://projects.artefactual.com/issues/4247

In essence, I suggest checking the other taxonomies, as your terms may have successfully imported, but as subjects rather than places - this might explain the successful import message and the empty places taxonomy.
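
If you have database access, a quick way to confirm this is to count terms per taxonomy. The following is only a rough sketch, not an official query: the term and taxonomy_i18n table and column names here are assumptions drawn from later posts in this thread, so adjust to your own schema.

-- Hypothetical check: how many terms sit in each taxonomy?
SELECT t.taxonomy_id, ti.name AS taxonomy_name, COUNT(*) AS term_count
FROM term t
LEFT JOIN taxonomy_i18n ti ON ti.id = t.taxonomy_id AND ti.culture = 'en'
GROUP BY t.taxonomy_id, ti.name
ORDER BY term_count DESC;

If the missing places show up under the Subjects taxonomy's id, that would confirm the behaviour described in the ticket.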

Let us know what you find! In the meantime, I have reassigned the issue ticket, and included it in our 1.4 release. Keep an eye on the issue ticket for any updates on our end (I'll try to remember to post back here as well), and if you end up finding more information about the nature of the bug or how you've solved it before we do, please feel free to share.

Hope that helps,


Dan Gillean
AtoM Product Manager / Systems Analyst,
Artefactual Systems, Inc.
604-527-2056



prathmes...@gmail.com

unread,
Aug 30, 2013, 11:00:16 AM8/30/13
to ica-ato...@googlegroups.com
Hi Dan,

    Thanks for the quick response! As you suspected, the terms have ended up in the Subjects taxonomy instead of Places. Thanks for opening the ticket; I'll keep an eye on it, and if I find something in the meantime, I'll post it here.
I re-populated and optimized the search index, but I am unable to find these terms by searching for them. Why is that, when I can clearly see them on the Subjects taxonomy page?
I also accidentally imported the file four times while trying to find a solution. As a result, I have duplicate entries in the Subjects taxonomy. What is the way to delete these duplicate entries (for future reference)?

Best,
Prathmesh

Jessica Bushey

unread,
Aug 30, 2013, 12:19:47 PM8/30/13
to ica-ato...@googlegroups.com
Dear Prathmesh,

This suggestion on how to delete *all* subject terms, or only those subject terms created after a certain date/time, might be useful to you: https://groups.google.com/d/msg/ica-atom-users/nSSMF1xdOL4/gvczswu5sMAJ
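
For reference, here is a rough sketch of the kind of check the linked post describes, previewing subject terms created after a given date before deleting anything. The names used here (object.created_at, term.taxonomy_id, the subjects taxonomy id of 35) are assumptions, so treat this as illustrative only and back up your database before running any actual DELETE based on it.

-- Hypothetical preview: subject terms created after a cut-off date/time.
SELECT t.id, ti.name, o.created_at
FROM term t
JOIN object o ON o.id = t.id
LEFT JOIN term_i18n ti ON ti.id = t.id AND ti.culture = 'en'
WHERE t.taxonomy_id = 35
  AND o.created_at > '2013-08-29 00:00:00';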

Let me know how that goes.

Jessica


---
Jessica Bushey, MAS
ICA-AtoM Product Manager
Systems Analyst
jes...@artefactual.com

Artefactual Systems Inc.
www.artefactual.com







Anne-Marie

unread,
Aug 30, 2013, 5:24:11 PM8/30/13
to ica-ato...@googlegroups.com
Hello, Artefactual!

We're going to try breaking up the XML file to work around the file size restrictions on the browser-based import. Are you aware of how large of a SKOS/taxonomy file can successfully be imported? My work back in 2011 with 1.1 indicated that description files up to 2 MB could be successfully imported based on our server settings (1024 MB).

I am attaching the article in which my tech and I documented this process for ICCROM's project (see page 83 under "XML import") and sharing this with Prathmesh as well.

These problems were also discussed in this 2011 forum post: https://groups.google.com/d/msg/ica-atom-users/99HEwYNlvv8/ys939EhkKmEJ

Thanks!

Anne-Marie H. Viola
Metadata & Cataloguing Specialist,
Image Collections and Fieldwork Archives (ICFA)
Dumbarton Oaks Research Library and Collection
1703 32nd Street, NW Washington, DC 20007
Comma_2011_2_06_Caravaca_Viola.pdf

Jesús García Crespo

unread,
Sep 4, 2013, 6:05:45 PM9/4/13
to ica-ato...@googlegroups.com
Hi Anne-Marie,

On Fri, Aug 30, 2013 at 2:24 PM, Anne-Marie <amhv...@gmail.com> wrote:
We're going to try breaking up the XML file to work around the file size restrictions on the browser-based import. Are you aware of how large of a SKOS/taxonomy file can successfully be imported? My work back in 2011 with 1.1 indicated that description files up to 2 MB could be successfully imported based on our server settings (1024 MB).

We haven't done any scalability testing on the SKOS import as far as I can remember, but as far as I know you should be able to import larger documents with these settings. Memory is cheap, so you can always add more memory as needed.

That said, I keep thinking that the command line will be a more reliable solution. FYI, we already have a developer looking at #4247.

Regards,

--
Jesús García Crespo,
Software Engineer, Artefactual Systems Inc.
http://www.artefactual.com | +1.604.527.2056

Anne-Marie

unread,
Sep 9, 2013, 4:36:02 PM9/9/13
to ica-ato...@googlegroups.com, je...@artefactual.com, meng...@doaks.org
Hi, guys,

I broke up that >1 MB SKOS XML file into three parts (each less than 600 KB), making sure to keep all children in the same file as their parent term for all of the city, region and country terms that we want in our Places taxonomy. I was able to successfully import each file through the interface, which created a navigable taxonomy up to three levels deep (country/region/city). (Note: the hierarchy is either country/region/city or just country/city.)

However, I am getting a 500 Internal Server Error / Zend_Acl_Exception message (see attached screenshot). This occurs whenever I click on either a city term within a region (so a third-level term) or a city term that is an "only child" (a lone city associated with a country). The next line of the error message notes that the parent resource id for the parent term (region or country) does not exist (despite my having been able to successfully view that very term).

When browsing the taxonomy "context tree," I notice that if a caret appears next to a parent term, I can always access its child terms successfully. But if a caret does not appear, I can see the links to those terms only in the parent term record, and I receive an error message when I click on those links.

Any ideas?

Thanks,
Anne-Marie H. Viola
Metadata & Cataloguing Specialist,
Image Collections and Fieldwork Archives (ICFA)
Dumbarton Oaks Research Library and Collection
1703 32nd Street, NW Washington, DC 20007

SKOSimportServerError.jpg

David Juhasz

unread,
Sep 13, 2013, 3:32:57 PM9/13/13
to ica-ato...@googlegroups.com, Jesús García Crespo, menganep
Hi Anne-Marie,

Usually that Zend_Acl_Exception error message means that the database hierarchy is corrupt. Hierarchy corruption typically occurs when an import does not run to completion. I've added the problem and our suggested solution to our FAQ, as it has come up a number of times in the AtoM community:

https://www.qubit-toolkit.org/wiki/Frequently_asked_questions#Why_do_I_get_a_.22Zend_ACL_Exception:_Parent_Resource_id:_.27XXXXX.27_does_not_exist.22_error.3F

Please let us know if rebuilding the hierarchy resolves your issue.



Cheers,
David

--

David Juhasz
Director, Technical Services
Artefactual Systems Inc.
www.artefactual.com



Anne-Marie

unread,
Sep 16, 2013, 4:15:40 PM9/16/13
to ica-ato...@googlegroups.com, Jesús García Crespo, menganep
Thank you, David. That worked.

On a related note, there appears to be a problem with the URLs the system generates for term records. The system is appending a dash and a number to the term (e.g. "http://54.225.221.244/icaatom/index.php/italy-3;term"). It is not entirely clear whether the number relates to the instance of the term record (i.e. two previous imports of the term "Italy"?), although we are seeing widely varying numbers (as high as 9) and have only made a handful of import attempts.

Is it possible to reset this behaviour and/or otherwise modify these URLs? In hopes of creating URIs, or at least user-friendly URLs, we would prefer strings that are as clean as possible. In the past, when we had to clear the Places taxonomy after an import, Prathmesh did so by deleting all of the Places terms from the "term_i18n" table only. Are there any other tables, in addition to "term_i18n", where these terms are stored and which might still hold entries from previous imports that need to be deleted?

Thank you,

Anne-Marie H. Viola
Metadata & Cataloguing Specialist,
Image Collections and Fieldwork Archives (ICFA)
Dumbarton Oaks Research Library and Collection
1703 32nd Street, NW Washington, DC 20007
http://www.doaks.org/library-archives/icfa

David Juhasz

unread,
Sep 16, 2013, 6:29:42 PM9/16/13
to ica-ato...@googlegroups.com, menganep
Hi Anne-Marie,

The "slug" (e.g. "italy-3") used for URLs in AtoM must be unique.  When a pre-existing resource (e.g. archival description, authority record, archival institution, taxonomy, term) is already using a given slug (e.g. "italy") then the system enforces uniqueness by adding a numeric suffix (e.g. "italy-2").   The system will find a unique slug by searching the database for the "maximum" value of any integer suffix for the current slug, then increment the found suffix by one. For example if "italy", "italy-2" and "italy-9" are slugs already in the database, a new slug for the place term "Italy" will become "italy-10".

You can "reset" the counter on this system by directly editing the slug_i18n table to change the value of the current maximum integer suffix for a given slug. In the example above this would mean replacing the "italy-9" slug_i18n value with "italy-3", or a completely different value (e.g. "italy-ny-usa").

You can also change the generated slug by directly modifying the MySQL database values in the "slug_i18n" table.  There is an outstanding issue to add the ability to edit slug values using the AtoM web UI <https://projects.artefactual.com/issues/3909> but so far we haven't had the developer time or funding to add this functionality.
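
As a rough illustration of the above, something like the following could be used to inspect and rename slugs directly. Note that, as corrected further down this thread, the table is actually named "slug" rather than "slug_i18n"; the column names here are assumptions, so back up the database and adapt as needed.

-- Hypothetical: list the existing "italy" slugs and the objects they point to.
SELECT slug, object_id
FROM slug
WHERE slug = 'italy' OR slug LIKE 'italy-%';

-- Hypothetical: "reset" the counter by renaming the highest-suffixed slug,
-- following the italy-9 / italy-3 example given above.
UPDATE slug
SET slug = 'italy-3'
WHERE slug = 'italy-9';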

Best regards,
David

--

David Juhasz
Director, Technical Services
Artefactual Systems Inc.
www.artefactual.com

prathmes...@gmail.com

unread,
Sep 17, 2013, 11:18:21 AM9/17/13
to ica-ato...@googlegroups.com, menganep, Vio...@doaks.org
Hi David,
 
   I am looking into this and will follow up with more questions later in the day. But can you quickly answer one thing: I was looking at the DB structure of my AtoM installation and there seems to be no "slug_i18n" table. There is a "slug" table, in which I was able to locate all the duplicate terms for "italy" as you explained. I have a total of 53 tables in my installed DB; is there something I'm missing?

Best,
Prathmesh


David Juhasz

unread,
Sep 17, 2013, 12:23:25 PM9/17/13
to ica-ato...@googlegroups.com, menganep, Anne-Marie Viola
Hi Prathmesh,

Sorry, that was my mistake.  You are correct that there is no "slug_i18n" table - please substitute "slug" table for all instances.

Cheers,
David

--

David Juhasz
Director, Technical Services
Artefactual Systems Inc.
www.artefactual.com



Mengane, Prathmesh

unread,
Sep 17, 2013, 2:02:30 PM9/17/13
to David Juhasz, ica-ato...@googlegroups.com, Viola, Anne-Marie
Hi David,

So the problem is that we have hundreds of terms with this slug issue, because we have attempted several imports of the same Places taxonomy, for various reasons, over a period of time.
It is therefore not practical to manually alter the slugs for all these hundreds of terms through the MySQL backend, or to delete these terms from the slug table one by one.
Unless there is a query you know of that can delete only these Places taxonomy entries from the slug table?

I tried: DELETE FROM object WHERE id IN (SELECT id FROM term WHERE taxonomy_id = 42);
This deletes at least one of the entries for "italy" from the slug table, but there are other duplicate "italy" terms still there.
I am confused about why the other "italy" terms are not deleted from the slug table after the above query. Don't they have the same taxonomy_id (i.e. 42), or is there a way to find out the taxonomy ID of the remaining duplicate terms?

Sorry if this sounds confusing; let me know if it does and I'll try to re-phrase it.

Best,
Prathmesh


Hutchinson, Tim

unread,
Sep 19, 2013, 2:01:41 PM9/19/13
to ica-ato...@googlegroups.com
Hi Prathmesh,

There's no doubt some SQL you could use for this, but I would suggest tracing back a few of the offending slugs by looking at the tables slug, object, object_term_relation and, finally, term, matching on the relevant id fields (it should be clear from the field names).

But looking at the earlier part of the thread, it sounds like you are also dealing with places that were imported as subjects? Taxonomy id 42 is for places; it looks like subjects would be 35.
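
A rough sketch of that kind of trace, joining the tables mentioned above, might look like the following; the join columns used here (slug.object_id, term.id, object_term_relation.term_id) are assumptions, so verify them against your own schema first.

-- Hypothetical trace: follow the "italy" slugs back to their terms and taxonomies.
SELECT s.slug, t.id AS term_id, t.taxonomy_id, ti.name
FROM slug s
JOIN term t ON t.id = s.object_id
LEFT JOIN term_i18n ti ON ti.id = t.id AND ti.culture = 'en'
WHERE s.slug = 'italy' OR s.slug LIKE 'italy-%';

-- Hypothetical follow-up: how many descriptions use each of those terms,
-- via object_term_relation, before deciding which duplicates to delete.
SELECT otr.term_id, COUNT(*) AS usage_count
FROM object_term_relation otr
JOIN slug s ON s.object_id = otr.term_id
WHERE s.slug = 'italy' OR s.slug LIKE 'italy-%'
GROUP BY otr.term_id;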

Tim

Tim Hutchinson
Head, University Archives & Special Collections
University Library, University of Saskatchewan
Tel: (306) 966-6028  Fax: (306) 966-6040
Email: tim.hut...@usask.ca
Web: http://library.usask.ca/archives/

Anne-Marie

unread,
May 22, 2014, 4:41:44 PM5/22/14
to ica-ato...@googlegroups.com
In planning for our upgrade to 2.0 in the coming months, I am revisiting this issue as we never fully imported our SKOS Places taxonomy in 1.3. With that said, in what order should we plan our imports -- Places first, then port over the existing description data? Will the existing terms in our cataloging recognize their equivalents, or will we run into the same slug issue described above?

Planning on exploring this further with testing on dev but figured I would survey the forum first!

Thanks, as always,
Anne-Marie H. Viola
Metadata & Cataloguing Specialist,
Image Collections and Fieldwork Archives (ICFA)
Dumbarton Oaks Research Library and Collection
1703 32nd Street, NW Washington, DC 20007
viola[at]doaks.org

Dan Gillean

unread,
May 22, 2014, 7:05:17 PM5/22/14
to ica-ato...@googlegroups.com
Hi Anne-Marie,

My understanding is that your slug issue was caused by importing the same place terms multiple times without first deleting the previous set, clearing the cache, and rebuilding the search index. If you are avoiding this, I'm hopeful that you won't encounter the same issues.

The issue with terms importing to the wrong taxonomy has been solved with a tweak to the user interface: users can now find a link to the SKOS import page directly on the general import page. By following that link, you can specify which taxonomy you are targeting with your import. See: https://www.accesstomemory.org/docs/2.0/user-manual/import-export/import-descriptions-terms/#import-skos-file

If you import the terms first, and then your descriptions with access points, AtoM should link the access points to the existing terms. As ever, I'd recommend testing this on your development installation first - and let us know how it goes!

Regards,

Dan Gillean, MAS, MLIS

AtoM Product Manager / Systems Analyst,
Artefactual Systems, Inc.
604-527-2056

Anne-Marie Viola

unread,
May 23, 2014, 11:05:29 AM5/23/14
to ica-ato...@googlegroups.com
Thanks for the quick reply, Dan. I noted the import functionality in another post and will DEFINITELY test this out ahead of time on dev. I imagine we'll want to make sure that our existing cataloging uses terms with the original slugs (i.e. NOT italy-5). If there are such slugs, what do you recommend to ensure a correct match-up on import: attempting to change them in the existing database, or in the new taxonomy?



Dan Gillean

unread,
May 23, 2014, 3:10:02 PM5/23/14
to ica-ato...@googlegroups.com
Hi Anne-Marie,

All the slugs in AtoM must be unique - so AtoM will only append a number (e.g. italy-5) if the root slug (e.g. in this case, italy) is already in use. If you have removed all your duplicates, or removed all terms and re-indexed and then reimported ONCE, you shouldn't have the slug problem.

If, however, you are trying to work with terms already in your database rather than recreating them, you certainly could try editing them in the DB. Naturally we don't really recommend this approach, but it could certainly work - and there's a constraint in the database schema enforcing the "Unique" flag on the slug.slug column, so that should prevent you from accidentally assigning the same slug twice and causing data problems. I would strongly recommend that you back up all your data before manipulating records directly in the database!

I'm pretty sure that terms/access points are matched on the title/name, so ultimately, the URL shouldn't matter for matching up access points in your descriptions to existing terms in your Places taxonomy on import.
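
If you want to sanity-check that before the big import, a query along these lines could flag duplicate names inside the Places taxonomy, which is where name-based matching could become ambiguous. This is purely a sketch; the Places taxonomy id of 42 and the term_i18n column names are assumptions.

-- Hypothetical: place terms whose names occur more than once in the taxonomy.
SELECT ti.name, COUNT(*) AS copies
FROM term t
JOIN term_i18n ti ON ti.id = t.id AND ti.culture = 'en'
WHERE t.taxonomy_id = 42
GROUP BY ti.name
HAVING COUNT(*) > 1;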

Let us know how it goes,

Dan Gillean, MAS, MLIS
AtoM Product Manager / Systems Analyst,
Artefactual Systems, Inc.
604-527-2056
@accesstomemory


Anne-Marie

unread,
Aug 18, 2014, 1:24:42 PM8/18/14
to ica-ato...@googlegroups.com
After multiple issues around import and such over the last few months, we have finally tested importing existing cataloging into a system with an established Places taxonomy. We performed this test in a dev instance of 2.0, importing first our comprehensive Places taxonomy in SKOS RDF XML (1983 terms) and then our existing cataloging (which includes 92 Place terms), by reimposing the database from our 1.3 instance. After doing so, the Places taxonomy was "reset" to the smaller of the two taxonomies and only shows 92 terms.

I can try importing our 37 collections one at a time to see what happens, but I am concerned about issues of scale. Any suggestions? Will report back on the results of the XML import.

Best,
Anne-Marie

Dan Gillean

unread,
Aug 18, 2014, 4:36:50 PM8/18/14
to ica-ato...@googlegroups.com
Hi Anne-Marie,

Just to clarify: are you running these imports via the GUI, or via the command-line?

If you are running them via the command line using the import:bulk task, are you using the --taxonomy flag to point the import to the correct taxonomy? For places, the taxonomy flag would be something like this:

php symfony import:bulk --taxonomy="42" /path/to/mySKOSfiles

Don't forget to clear the cache and rebuild the search index after!

php symfony cc
php symfony search:populate
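
If you need to confirm the numeric taxonomy IDs on your own installation before using the --taxonomy flag, a query roughly like this should do it (the taxonomy_i18n table and column names are an assumption on my part):

-- Hypothetical: list taxonomy IDs with their names to confirm which one is "Places".
SELECT t.id, ti.name
FROM taxonomy t
JOIN taxonomy_i18n ti ON ti.id = t.id AND ti.culture = 'en'
ORDER BY t.id;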

I'm also not sure what you mean by "reimposing the database from our 1.3 instance" but there have likely been changes in the database schema between 1.3 and our current development branch, so I wouldn't recommend this - you should instead be using either a clean database (by following our installation instructions), or you should make sure that all upgrade instructions in our documentation have been followed.

Before you retry the above, you might want to make sure your DB schema is up-to-date:

php symfony tools:upgrade-sql

Let us know if that helps!

Regards,

Dan Gillean, MAS, MLIS

AtoM Product Manager / Systems Analyst,
Artefactual Systems, Inc.
604-527-2056
@accesstomemory

prathmes...@gmail.com

unread,
Aug 18, 2014, 4:53:10 PM8/18/14
to ica-ato...@googlegroups.com
Hi Dan,

   I would like to clarify our procedure here to clear up the confusion. We are not importing a database from the 1.3 instance for this feature.

I had already migrated the database from the 1.3 instance to the 2.0 instance some time back, following all the steps in the upgrade instructions; the import and upgrade were both successful on this instance. For the SKOS import test, I first backed up the existing 2.0 database and then wiped the database clean. Then I imported the SKOS places using the front-end SKOS import feature. And then I re-instated the backed-up database to check whether the SKOS-imported places automatically get assigned to the collections or not. As you learned from Anne-Marie, not only do they not automatically link to the right collections, but the imported places also get replaced by the place entries from the backed-up database.

Best,
Prathmesh Mengane,
Database and CMS Developer,
Dumbarton Oaks, Washington D.C.

Dan Gillean

unread,
Aug 18, 2014, 5:15:49 PM8/18/14
to ica-ato...@googlegroups.com
Hi Prathmesh,

Unfortunately, if I am understanding you correctly, there is no way that this will work.

Here's a link to the first resource I found on working with MySQL dumps: http://www.thegeekstuff.com/2008/09/backup-and-restore-mysql-database-using-mysqldump/

In the second line, it describes what happens with a traditional sql dump: "It creates a *.sql file with DROP table, CREATE table and INSERT into sql-statements of the source database."

The first thing that happens is a DROP table command - meaning your previous data will be gone when you load the backup.

Even if you were to tailor your dump to skip the DROP table command, there is nothing in MySQL that will automatically know how and when and where to merge your data coherently - generally this is up to the application. Creating a framework to do so in AtoM could be inordinately complex, and not something we would take on lightly - there are too many use cases and edge cases to make something like this work for a broad segment of users across a variety of needs.

Here is the MySQL documentation for the mysqldump utility in MySQL 5.5: https://dev.mysql.com/doc/refman/5.5/en/mysqldump.html

I am not a developer, but if you feel confident working with SQL, you may be able to use some of the many options to tailor how you both create and then load your SQL dumps. Note that AtoM itself will still have no idea that you have merged the data, so I imagine you would need to restart all services, clear the cache, rebuild the search index, etc., after any attempt to load and merge a dump.

Best of luck,

Dan Gillean, MAS, MLIS
AtoM Product Manager / Systems Analyst,
Artefactual Systems, Inc.
604-527-2056
@accesstomemory


Anne-Marie Viola

unread,
Aug 18, 2014, 5:25:23 PM8/18/14
to ica-ato...@googlegroups.com
Dan,

Do you instead recommend importing the records as XML?

Thanks for the quick feedback,


Dan Gillean

unread,
Aug 18, 2014, 6:18:31 PM8/18/14
to ica-ato...@googlegroups.com
Hi Anne-Marie,

Sorry if my last post was unclear :)

For the large SKOS file, I would definitely recommend using the command-line, and not the user interface. You may want to use the CLI for both, to be certain.

My last email was more in response to this in Prathmesh's post:

And then I re-instated the backed-up database to check whether the SKOS-imported places automatically get assigned to the collections or not. As you learned from Anne-Marie, not only do they not automatically link to the right collections, but the imported places also get replaced by the place entries from the backed-up database.

It is definitely wise to back up your DB before trying to import. However, if the goal is to import and have the imported terms live alongside your existing data, there is no need to wipe your DB first and then try to load the backup on top of the import. I would instead suggest:
  1. Back up your DB
  2. Run the import command similar to how I demonstrated in my previous post, using the --taxonomy flag to point the SKOS file to the correct (Places) taxonomy.
  3. If it worked, great!
  4. If it didn't work, you have a backup you can now load, so you can troubleshoot and then try again.

Another thing that occurred to me: I believe the --taxonomy flag was added AFTER the 2.0.1 release - meaning it is not currently available in a public release. If you are testing with a development branch, you can have Prathmesh track the qa/2.1.x branch - it is the branch we are currently testing internally for our 2.1 release, mostly stable though not yet an official stable public release. Alternatively, if you'd like to use this option, you might wish to wait until the beginning of September, when we expect the 2.1 release to be publicly available.

Otherwise, if you intend to use the GUI, then I do recommend you break up your imports into far smaller chunks. Make sure you are following the import instructions here, to make sure they end up in the correct taxonomy.

Cheers,

 

Dan Gillean, MAS, MLIS
AtoM Product Manager / Systems Analyst,
Artefactual Systems, Inc.
604-527-2056
@accesstomemory


prathmes...@gmail.com

unread,
Aug 19, 2014, 9:34:29 AM8/19/14
to ica-ato...@googlegroups.com
Thanks Dan,

  My current AtoM development instance is not exactly tracking the qa/2.1.x branch, due to a GitHub root-permission issue on my side. I'll sort that out and create a separate forum topic if need be.

But I've applied the fix from your 2.1.x branch that allows me to import from the CLI using the --taxonomy flag (the fix being https://github.com/artefactual/atom/commit/e393fc8a6e977a8b554112b644a1e3b92c9886bc), so I'm able to import large files from the CLI.

We will follow your steps and import the SKOS file from the CLI into the database with the already-existing collections.


Best,
Prathmesh Mengane,
Database and CMS Developer,
Dumbarton Oaks, Washington D.C.

