Good morning/afternoon,
We are in the process of becoming SQF certified at our Storage and Distribution center, which repacks many items and then reuses the empty boxes. My biggest issue is that they do not remove the prior label, and sometimes it shows through underneath the new label, or the new label peels off.
My first question: should they be required to remove the original label prior to shipping? Could this be a liability if the new label peels off?
My second question: we are dealing with multiple allergens, so is it acceptable to repack an item into a box that previously contained a different allergen? For example, repacking cottage cheese into a box that held tuna.
All items being repacked are sealed with no open food contact.
I apologize in advance if there are already threads on this topic; I looked but could not find the specific answer I was after. This has been bothering me ever since we started moving forward with SQF certification.
Thank you everyone for your time and feedback!
I appreciate every one of you!!!
Sincerely,
Chris
I would have ALL old labeling removed. It isn't necessarily a requirement, but it invites trouble with your customers at receipt. Yes, if the new label peels off, you now have a mislabelled product, and that is a liability.
If you are only repacking sealed containers into boxes that are completely intact, I think you should be fine, as you're not repacking the retail containers themselves. Obviously, if something has leaked or the carton is damaged, it should not be reused.
Are the labels being removed the original product/manufacturer labels, or your company's secondary shipping labels? If they are just your facility's labels, and the original product labels remain intact, removing them and applying a new one can work.
However, if these are the original product labels, removing them can create bigger traceability issues on your end. Usually, in a storage and distribution setting, containers are sold as bulk units/multipacks, and the original box label contains all the necessary information.
It's just bad practice not to remove old labels, for lots of reasons, not the least of which are the FIFO issues that could arise: the facility is left with expired product because the new label came off and staff are looking at the original label, which (a) names a different product than the contents and (b) shows a different best-before date.
I disagree with repacking into boxes that held different allergens: if your label peels off, you have a potential labeling violation on your end. Also, how do you know what the packaging was exposed to at the initial manufacturing facility, especially if that facility only deals with one type of allergen?
I agree that, at a minimum, we should remove the original label if we want to reuse the original box, but the more I think about it, and from reading your comments, it sounds as though we need to discard the old boxes and start fresh to eliminate mislabeling and allergen cross-contamination.
Nowadays there is no difference: git gc --aggressive operates according to the suggestion Linus made in 2007; see below. As of version 2.11 (Q4 2016), git defaults to a depth of 50. A window of size 250 is good because it scans a larger section of each object, but depth at 250 is bad because it makes every chain refer to very deep old objects, which slows down all future git operations for marginally lower disk usage.
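These defaults can be inspected and overridden per repository. A minimal sketch, assuming a POSIX shell and git on PATH; `gc.aggressiveDepth` and `gc.aggressiveWindow` are the real config keys, and the throwaway repository exists only for illustration:

```shell
set -e
repo=$(mktemp -d)            # throwaway repository for illustration
git init -q "$repo"
# Pin the values `git gc --aggressive` will use; 50/250 mirror the
# defaults since git 2.11.
git -C "$repo" config gc.aggressiveDepth 50
git -C "$repo" config gc.aggressiveWindow 250
git -C "$repo" config --get gc.aggressiveDepth   # prints 50
```

Setting these in a repository's config lets you keep the larger window (good) while capping the chain depth (avoiding the runtime cost described above).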
git-filter-branch can be used to get rid of a subset of files, usually with some combination of --index-filter and --subdirectory-filter. People expect the resulting repository to be smaller than the original, but you need a few more steps to actually make it smaller, because Git tries hard not to lose your objects until you tell it to. First make sure that:
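Those extra steps are sketched below on a throwaway repository (the file names are made up for the demo): drop the `refs/original/*` backup refs that filter-branch leaves behind, expire the reflog, and only then prune — until all three happen, the rewritten-away objects remain reachable and `gc` will keep them.

```shell
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.name tester
git config user.email tester@example.com
printf 'secret payload' > big.bin
git add big.bin
git commit -q -m 'add big.bin'
echo ok > keep.txt
git add keep.txt
git commit -q -m 'add keep.txt'
# Rewrite history to drop big.bin; filter-branch keeps backups in refs/original/*
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f \
  --index-filter 'git rm --cached --ignore-unmatch -q big.bin' HEAD
# The steps that actually shrink the repo: delete the backup refs,
# expire the reflog, then prune the now-unreachable objects.
git for-each-ref --format='delete %(refname)' refs/original | git update-ref --stdin
git reflog expire --expire=now --all
git gc --prune=now --quiet
```

After this, `git rev-list --objects --all` no longer lists `big.bin`, and the pack no longer carries its blob.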
"git gc --aggressive" used to limit the delta-chain length to 250, which is way too deep for gaining additional space savings and is detrimental for runtime performance.
The limit has been reduced to 50.
Items (1) and (2) are good matches for an "aggressive" repack.
They ask the repack to do more computation work in the hopes of getting a better pack. You pay the costs during the repack, and other operations see only the benefit.
Item (3) is not so clear.
Allowing longer chains means fewer restrictions on the deltas, which means potentially finding better ones and saving some space.
But it also means that operations which access the deltas have to follow longer chains, which affects their performance.
So it's a tradeoff, and it's not clear that the tradeoff is even a good one.
You can see that the CPU savings for regular operations improve as we decrease the depth.
But we can also see that the space savings are not that great as the depth goes higher. Saving 5-10% between 10 and 50 is probably worth the CPU tradeoff. Saving 1% to go from 50 to 100, or another 0.5% to go from 100 to 250 is probably not.
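The tradeoff is easy to measure on your own repository. The sketch below builds a small synthetic repo and repacks it at two depths with the same window; on a real history, the size gap between the two packs versus the access cost of deeper chains is exactly the tradeoff being discussed (the repo contents here are made up):

```shell
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.name tester
git config user.email tester@example.com
# a handful of growing commits so there is something to delta-compress
i=0
while [ $i -lt 20 ]; do
  seq 1 200 >> data.txt
  git add data.txt
  git commit -q -m "rev $i"
  i=$((i+1))
done
# shallow chains: faster access, possibly a slightly larger pack
git repack -a -d -f -q --depth=10 --window=250
shallow=$(du -sk .git/objects/pack | cut -f1)
# deep chains: smaller pack, slower access on delta-heavy operations
git repack -a -d -f -q --depth=250 --window=250
deep=$(du -sk .git/objects/pack | cut -f1)
echo "depth=10: ${shallow}K  depth=250: ${deep}K"
```

On a toy repo the sizes will barely differ; the point is the method, which you can run against a scratch clone of a real repository before committing to a depth.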
While git generally tries to re-use delta information (because it's a good idea, and it doesn't waste CPU time re-finding all the good deltas we found earlier), sometimes you want to say "let's start all over, with a blank slate, and ignore all the previous delta information, and try to generate a new set of deltas".
Usually there is no need to recalculate deltas in git, since git is quite flexible about reusing the deltas it has already found. Recomputing only makes sense if you know that you have really, really bad deltas; as Linus explains, it is mainly tools that use git fast-import that fall into this category.
Linus ends his mail with the conclusion that git repack with a large --depth and --window is the better choice most of the time, especially after you have imported a large project and want to make sure that git finds good deltas.
where that depth thing is just about how deep the delta chains can be (make them longer for old history - it's worth the space overhead), and the window thing is about how big an object window we want each delta candidate to scan.
On my 8 GB machine, an aggressive gc ran out of memory on a 1 GB repository with 10k small commits. When the OOM killer terminated the git process, it left me with an almost empty repository; only the working tree and a few deltas survived.
Of course, it was not the only copy of the repository, so I just recreated it and pulled from the remote (fetch did not work on the broken repo and deadlocked at the 'resolving deltas' step the few times I tried), but if your repo is a single-developer local repo without any remotes, back it up first.
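For a remote-less repository, a bundle is a cheap single-file backup to take before an aggressive gc. A sketch on a throwaway repo (the bundle path and file names are arbitrary):

```shell
set -e
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.name tester
git config user.email tester@example.com
echo hello > file.txt
git add file.txt
git commit -q -m 'initial commit'
# One file that captures every ref and its objects; `git clone` can
# restore a repository from it.
bundle="$repo.bundle"
git bundle create "$bundle" --all
git bundle verify "$bundle"    # confirms the bundle is self-contained
# ...now it is safe to run: git gc --aggressive
```

If the gc ever leaves the repository broken, `git clone "$bundle" recovered` gets the history back without depending on the damaged object store.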
The existing "gc --aggressive" docs come just short of recommending to users that they run it regularly.
I've personally talked to many users who've taken these docs as advice to use this option, and for them it has usually been (mostly) a waste of time.
When the --aggressive option is supplied, git-repack will be invoked with the -f flag, which in turn will pass --no-reuse-delta to git-pack-objects.
This will throw away any existing deltas and re-compute them, at the expense of spending much more time on the repacking.
The effects of this are mostly persistent: e.g. when packs and loose objects are coalesced into another pack, the existing deltas in that pack might get re-used, but there are also various cases where we might pick a sub-optimal delta from a newer pack instead.
Furthermore, supplying --aggressive will tweak the --depth and --window options passed to git-repack.
See the gc.aggressiveDepth and gc.aggressiveWindow settings below.
By using a larger window size we're more likely to find more optimal deltas.
It's probably not worth it to use this option on a given repository without running tailored performance benchmarks on it.
It takes a lot more time, and the resulting space/delta optimization may or may not be worth it. Not using this at all is the right trade-off for most users and their repositories.
As noted in that commit, it was wrong to make greater depth the default for "aggressive": it saves disk space at the expense of runtime performance, which is usually the opposite of what someone reaching for an "aggressive gc" wants.
Many medicines are repackaged into monitored dosage systems ("blister packs") or plastic compartmentalised containers ("pill boxes") to aid medication adherence and safe administration, particularly in residential care facilities. However, healthcare professionals are reminded that removing some medicines from original packaging to repack them into another dosage system may adversely affect the characteristics of the medicine. Repacking medicines can impact the medicine's effectiveness and safety.
The Centre for Adverse Reactions Monitoring (CARM) recently received a report from a patient who experienced haematuria after he began repacking dabigatran (Pradaxa) into a weekly pill box. These symptoms resolved once he stopped repacking dabigatran and kept the capsules in the original container. Dabigatran capsules absorb moisture from their surroundings if removed from the original packaging. Moisture absorption may increase the bioavailability of the dabigatran dose, which can in turn increase the risk of adverse effects.
This case highlights that not all medicines are suitable for repacking. In general, the original container protects the medicine from heat, air, light and/or moisture. Exposure to these elements may affect the stability of the formulation and/or the active ingredient, which can alter the effectiveness and safety of the medicine.
Some medicines also have special handling requirements. Healthcare professionals who consider repacking or administering medicines need to be aware of these for their own safety. Some examples of medicines that have special handling requirements include: