Open Thermocycler Planning


Jake

Apr 15, 2009, 5:32:25 PM
to DIYbio
Just thought I'd start a new thread since we were kind of jacking the
Open Gel Box 2.0 update thread. We got to talking about an open
thermocycler and it seemed like a whole topic in itself.

So here is some standard PCR info to start off the discussion and
requirements...

Here is a standard protocol...

Initial Conditions:

Initial denaturation at start: 92 - 97oC for 3 - 5 min. If you
denature at 97oC, denature sample only; add rest of mix after reaction
cools to annealing temperature (prevents premature denaturation of
enzyme).

Initial annealing temperature: as high as feasible for 3 min (eg: 50 -
75oC). Stringent initial conditions mean less non-specific product,
especially when amplifying from eukaryotic genomic DNA. [note: I don't
agree with this, use a safer (lower) temp. Otherwise do 'touchdown'
PCR, which is basically what this is suggesting.]

Initial elongation temperature: 72oC for 3 - 5 min. This allows
complete elongation of product on rare templates.

Temperature Cycling:
92 - 94oC for 30 - 60 sec (denature)
37 - 72oC for 30 - 60 sec (anneal)
72oC for 30 - 60 sec (elongate) (60 sec per kb target sequence
length)
25 - 35 cycles only (otherwise enzyme decay causes artifacts)
72oC for 5 min at end to allow complete elongation of all product DNA

"Quickie" PCR is quite feasible: eg, [94oC 30 sec / 45oC 30 sec / 72oC
30 sec] x 30, for short products (200 - 500 bp).
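As a rough sketch, the cycling program above could also be written down as plain data rather than prose. The temperatures and times come straight from the "quickie" protocol; the representation itself (field names, structure) is invented for illustration:

```python
# Hypothetical machine-readable form of the "quickie" PCR program above.
# Temperatures in degrees C, times in seconds; field names are made up.
quickie_pcr = {
    "cycles": 30,
    "steps": [
        {"name": "denature", "temp_c": 94, "secs": 30},
        {"name": "anneal",   "temp_c": 45, "secs": 30},
        {"name": "elongate", "temp_c": 72, "secs": 30},
    ],
}

def total_cycling_time(program):
    """Total seconds spent in the repeated cycling steps."""
    per_cycle = sum(step["secs"] for step in program["steps"])
    return per_cycle * program["cycles"]

print(total_cycling_time(quickie_pcr))  # 90 s per cycle x 30 cycles = 2700
```

A thermocycler controller only really needs a table like this plus the initial and final hold steps.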

YOU CAN USE GLYCEROL IN THERMAL CYCLER REACTION TUBE HOLES TO ENSURE
GOOD THERMAL CONTACTS

DON'T RUN TOO MANY CYCLES: if you don't see a band with 30 cycles you
probably won't after 40; rather take an aliquot from the reaction mix
and re-PCR with fresh reagents.
--------------------------------------------

Denaturing temperatures are 91-97C; the normal denaturation time is 1
min at 94C.
Tm = melting temperature
Ta = annealing temperature

Tm = 4(G + C) + 2(A + T) (in deg. C)
Ta = 5C below the lowest Tm of the pair of primers to be used.
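The two formulas above translate directly into code. This is just the Wallace rule as stated (the function names and example primer sequences are made up for illustration):

```python
def tm_wallace(primer):
    """Wallace rule from above: Tm = 4(G + C) + 2(A + T), in deg C."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    at = p.count("A") + p.count("T")
    return 4 * gc + 2 * at

def annealing_temp(primer_1, primer_2):
    """Ta = 5 C below the lower of the two primer Tms."""
    return min(tm_wallace(primer_1), tm_wallace(primer_2)) - 5

print(tm_wallace("ATGCATGCATGCATGCATGC"))  # 10 G/C + 10 A/T -> 60
print(annealing_temp("ATGCATGCATGCATGCATGC", "GCGCGCGCGCATATATATAT"))  # 55
```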

Elongation Temperature and Time is normally 70 - 72C, for 0.5 - 3 min.


Taq polymerase has a half-life of 30 min at 95C, which is partly why
one should not do more than about 30 amplification cycles. It is
possible to reduce the denaturation temperature after about 10 rounds
of amplification, as the mean length of target DNA is decreased: for
templates of 300bp or less, denaturation temperature may be reduced to
as low as 88C for 50% (G+C) templates (Yap and McGee, 1991), which
means one may do as many as 40 cycles without much decrease in enzyme
efficiency.

"Time at temperature" is the main reason for denaturation / loss of
activity of Taq: thus, if one reduces this, one will increase the
number of cycles that are possible, whether the temperature is reduced
or not. Normally the denaturation time is 1 min at 94C: it is
possible, for short template sequences, to reduce this to 30 sec or
less. Increase in denaturation temperature and decrease in time may
also work: Innis and Gelfand (1990) recommend 96C for 15 sec.

The melting temperature of a DNA duplex increases both with its
length, and with increasing (G+C) content: a simple formula for
calculation of the Tm is: Tm = 4(G + C) + 2(A + T)oC.

One consequence of having too low a Ta is that one or both primers
will anneal to sequences other than the true target, which leads to
"non-specific" amplification and consequent reduction in yield of the
desired product.

A consequence of too high a Ta is that too little product will be
made, as the likelihood of primer annealing is reduced; another and
important consideration is that a pair of primers with very different
Tas may never give appreciable yields of a unique product, and may
also result in inadvertent "asymmetric" or single-strand amplification
of the most efficiently primed product strand.

Annealing does not take long: most primers will anneal efficiently in
30 sec or less, unless the Ta is too close to the Tm, or unless they
are unusually long.

Elongation Temperature and Time is normally 70 - 72C, for 0.5 - 3 min.

At around 70C the activity is optimal, and primer extension occurs at
up to 100 bases/sec. About 1 min is sufficient for reliable
amplification of 2kb sequences (Innis and Gelfand, 1990). Longer
products require longer times: 3 min is a good bet for 3kb and longer
products. Longer times may also be helpful in later cycles when
product concentration exceeds enzyme concentration (>1nM), and when
dNTP and / or primer depletion may become limiting.

If desired product is not made in 30 cycles, take a small sample (1ul)
of the amplified mix and re-amplify 20-30x in a new reaction mix
rather than extending the run to more cycles: in some cases where
template concentration is limiting, this can give good product where
extension of cycling to 40x or more does not.

Bryan Bishop

Apr 15, 2009, 5:34:49 PM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 4:32 PM, Jake <jake...@mail.com> wrote:
> Here is a standard protocol...

Good god, what are you doing? *Here* is a standard protocol- :-) see,
we're already working on this.

http://heybryan.org/~bbishop/docs/protocols/pcr.xml

No need to use confusing English. The computer can (later) generate
the confusing English if you insist. :-) Maybe you could review the
pcr.xml example and compare it to the PCR protocol that you wrote up
in your email?

- Bryan
http://heybryan.org/
1 512 203 0507

Jake

Apr 15, 2009, 5:36:23 PM
to DIYbio
One other bit about hot start PCR. There's probably a feature we
might need for it.

"Hot Start" PCR:

In certain circumstances one wishes to avoid mixing primers and target
DNA at low temperatures in the presence of Taq polymerase: Taq pol is
almost as efficient as Klenow pol at 37oC; consequently, if primers
mis-anneal at low temperature prior to initial template denaturation,
"non-specific" amplification may occur. This may be avoided by only
adding enzyme after the initial denaturation, before the reaction
cools to the chosen annealing temperature. This is most conveniently
done by putting wax "gems"TM into the reaction tube after addition of
everything except enzyme, then putting enzyme on top of the gem: the
wax melts when the temperature reaches +/-80oC, and the enzyme mixes
with the rest of the reaction mix while the molten wax floats on top
and seals the mix, taking the place of mineral oil. Information is
that "gems" may be substituted by VaselineTM.

Jake

Apr 15, 2009, 5:43:11 PM
to DIYbio
Thanks for posting that! A lot of my cut-and-paste quickie came right
from there. Good work I must say.

I just wanted to post something in this thread to get the basic
requirements for the thermocycler down since Tito was asking. We'll
have to remember to give credit to all of the authors for helping with
design specs and reference material.

Bryan Bishop

Apr 15, 2009, 5:52:29 PM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 4:43 PM, Jake <jake...@mail.com> wrote:
> I just wanted to post something in this thread to get the basic
> requirements for the thermocycler down since Tito was asking.  We'll
> have to remember to give credit to all of the authors for helping with
> design specs and reference material.

Well, that's nice of you, but that's actually not my point (credit
isn't the point). The point is that a computer-readable format has
previously been discussed here (and I'm fine discussing it again of
course) and it just seems to be going backwards to have this full
block of English text about a protocol. I mean, it's good text of
course, but if we're going to be discussing specs for a machine, let's
just ask it to implement CLP-ML and be done with it. Of course, I
still think it's possible to come up with something better than
CLP-ML, or find errors in the pcr.xml example.

Some of the original discussions from way back when-

http://groups.google.com/group/diybio/browse_frm/thread/ada2289ebbc00fe0/6081750dd0eb5de1?lnk=gst&q=pcr.xml#6081750dd0eb5de1

As Mac put it-

"""
I think semantic representation of recipes / protocols is fascinating.
I could imagine using a tool to functionally define my starting
conditions and desired ending conditions and having it generate a
custom protocol for me by finding the set of protocols connected the
initial conditions to the ending conditions. Protocols are
essentially modular operations with defined inputs and outputs, so *in
principle* it would just be a matter of matching inputs and outputs
up.

For example: how would I purify a 3kb insert from a plasmid carried by
a colony I have growing on a plate?

If it were possible to build a structured representation of laboratory
operations, and we avoided getting bogged down in the semantics (is it
a material? is it a reagent?), I could imagine such a system being
used to:
- synthesize custom protocols optimized to produce a final quantity of
a desired product
- optimize workflow by finding the minimum number of operations
required to reach a desired product
- keep better track of supplies in the lab (more real-time)
- form the basis of protocol walk-through educational tools.
"""

So, anyway, I think there needs to be a way to ask whether or not the
machine can satisfy what the protocol is asking for. For instance, the
pcr.xml example isn't anything particularly fancy. However, in some
cases you need something genuinely fancy, and it would be nice to be
able to check whether or not a machine has the capabilities that
you're requesting of it. For instance, if you tag your pcr protocol
(or it has within it) a requirement about a certain maximum
temperature, then that should be within the operating range of the
thermocycler that you're considering to build, or whatever. In some
cases these are easily tweakable variables, and it doesn't matter much
and this looks like pre-emptive optimization. But as it turns out
these issues show up for every project :-) including the gel box
project, or the transilluminator, where there are different settings
and parameters you need to set everything up with. In practice, if the
thermocycler ends up with an ethernet connection, then you'd just
dump/cat the file to it, or something, or if it's USB-wired, you
wouldn't send it through /dev/eth0 but rather elsewhere like a mounted
device.

So, I do mean 'tagging' when I say it- for instance, if Jonathan
packages up his hardware in a format that he and I have been
discussing (or maybe we haven't- I forget)- then we can tell
automatically whether or not the equipment can handle a certain
protocol that you want to execute, just by comparing two files: the
protocol file and the machine's metadata/spec/packaging file. Neat,
huh? Makes
everything a lot simpler. So exploring the range of different
variables that we need to address (like the temperature ranges and
volume ranges) is an important first step. But also figuring out how
to express it in a computer-readable format. (Translators for taking
pcr.xml -> English are easier than English -> xml, for anybody
wondering- unless we have some computational linguistics people
sitting around with a few tricks up their sleeves).
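That two-file comparison could be sketched in a few lines. Everything here (field names, the specific numbers, the shape of both files) is invented for illustration; the point is just that the check is mechanical once both sides are computer-readable:

```python
# Hypothetical protocol requirements and machine spec, as two dicts
# (in practice these would be loaded from the two files).
protocol = {"min_temp_c": 4, "max_temp_c": 95, "tube_volume_ul": 50}
machine = {"temp_range_c": (0, 99), "max_tube_volume_ul": 200}

def machine_satisfies(protocol, machine):
    """Can this machine execute this protocol? Pure range comparison."""
    lo, hi = machine["temp_range_c"]
    return (lo <= protocol["min_temp_c"]
            and protocol["max_temp_c"] <= hi
            and protocol["tube_volume_ul"] <= machine["max_tube_volume_ul"])

print(machine_satisfies(protocol, machine))  # True for these values
```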

JonathanCline

Apr 15, 2009, 6:00:35 PM
to DIYbio, jcl...@ieee.org
On Apr 15, 4:43 pm, Jake <jakes...@mail.com> wrote:

> I just wanted to post something in this thread to get the basic
> requirements for the thermocycler down since Tito was asking.  We'll
> have to remember to give credit to all of the authors for helping with
> design specs and reference material.


How about use cases explained, too? A protocol doesn't really explain
why you are doing the experiment. This is analogous to Tito's
request for "describe what it is you are trying to do."

I know there are many uses for PCR; here are a couple.

1. PCR for detection: amplify DNA/RNA in order to run the result in a
gel, then discard sample and gel.

2. PCR for cloning: amplify DNA/RNA in order to get a larger sample
per volume, preserving sample.

3. ?


The real benefit of doing these individual projects (gel box,
thermocycler, spectrophotometer, etc.) is that they can eventually be
integrated into a single system that auto-magically feeds results from
one step into another device. At that point, a long experiment can be
run with less intervention (possibly entirely hands-free), while
feedback is given to the user during the process.


## Jonathan Cline
## jcl...@ieee.org
## Mobile: +1-805-617-0223
########################

Nathan McCorkle

Apr 15, 2009, 7:07:21 PM
to diy...@googlegroups.com
Bryan, can you give a brief description of the machine language/UML thing you're talking about? (without me looking at that old thread right now)

Why not just build a spec sheet for it? Maybe this format you mention is just that, but really it could all be in a text file; no need for CAD drawings, etc... just give the dimensions of everything, a list of the hardware and electronics, the wiring scheme, and links/examples to/of software that would properly address each hardware interface.

-Nate
--
Nathan McCorkle
Rochester Institute of Technology
College of Science, Biotechnology/Bioinformatics

Bryan Bishop

Apr 15, 2009, 7:35:59 PM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 6:07 PM, Nathan McCorkle <nmz...@gmail.com> wrote:
> Bryan, can you give a brief description of the machine language/UML thing
> you're talking about? (without me looking at that old thread right now)

For protocols, you mean? I don't know which part you're talking about
since your email quotes Jonathan Cline quoting Jake. So, if you're
asking me about a representation format for biological protocols,
there are three formats that I know of- one was introduced by Mac
(EXACT), and it turned out to have some haskell programming behind it
which was a pretty big plus. The other two formats are CLP-ML and PSL.
CLP-ML was found in the literature as a way to represent clinical
laboratory protocols in XML, so the authors of the paper of course
uploaded an XML DTD as the supplementary material. In this protocol,
you specify- exactly- in a way that is *validated* by the computer (so
it can yell at you if you are being somewhat ambiguous, or something)-
what is required for the experiment, the exact steps, which materials
in particular are involved, etc. etc. I even think they went as far as
doing some crazy voodoo magic with URIs for referencing materials, but
I don't know if that's necessary (it would be neat though, but I'm not
going to implement it right away- somebody else is welcome to, of
course). The other format, PSL, is a process-specification language
that was developed by NIST in the late 90s or something. It was used
for the representation of manufacturing automation processes, such as
to compress the process-information about an automobile factory (for
building a car) into a single 'recipe'. But it turns out that the
project is dead now, and CLP-ML, IMHO, seems more alive, even though
there's nobody using it other than the people on this list who know
about it. The EXACT examples were fairly neat- they had two haskell
scripts; one was to generate the instructions for culturing yeast in
batch, and another one was the core library files. However, oddly
enough they also supplied something that looked pretty neat-
human-readable text (not haskell) that was describing what you should
do at each stage, and there were pre/post-conditions .. the odd part
is that it wasn't generated by the haskell script, so I'm left
wondering what on earth was going on there- maybe they had some other
software that they aren't sharing? I haven't asked them to share it
with me yet, so I don't know if it exists or not; anyway, it's
something we (or I) should write ourselves.

Now, if you mean hardware packaging formats, that's another entire can
of worms. Let me get to that next.

> Why not just build a spec sheet for it, maybe this format you mention is
> just that, but really it could all be in a text file, no need for CAD
> drawings, etc... just give the dimensions of everything, a list of the
> hardware and electronics, the wiring scheme, and links/examples to/of
> software that would properly address each hardware interface.

Spec sheets are one of the most awesome things that the age of
electronics has brought about. The problem is that the majority of
spec sheets are actually just encoded into PDFs, rather than into a
standardized format. However, there has been some work here-
there's something called ECIX, which is a way to represent electronic
datasheets in a computer-parseable way. What this allows is for
electronic design software packages (such as related to VHDL, or VLSI,
or Verilog, or sometimes just PCB-related-electronics) to pick and
choose parts that have compatible specifications. Unfortunately there
is no ECIX integration with gEDA, the GNU Electronic Design
Automation package (yet). I'm sure this will change in the future-
especially once there are more datasheets published in the ECIX
format. You can google around for it. As I recall, it's basically an
XML DTD, and there's an example dot XML file that references it for
the implementation of a Samsung timer IC, or something. It even had
pin geometry layout information, which is very useful for SMT
(pick-and-place-- the giant machines that pick electronic
components off reels and place them on boards based on a
PCB definition file for an electronic circuit).

I am not opposed to a text file. Basically a text file on Windows is
just something with the ".txt" file extension (but you already know
this), and you can change which program opens up those files by
default, blah blah blah. So, the whole point in working with PyYAML is
that (1) you can use the interactive python interpreter in real time
to play around with it (if you have to- you probably won't), and (2)
YAML is human-readable, and much more friendly than XML. So, if you
have a file like "pcr.yaml", you could associate that (on Windows, or
any other operating system) with notepad, or wordpad, and it's
practically a text file, except it'll just be used somewhat
differently. I don't know if this is too basic of an explanation or
not. I do agree that the dimensions should be put into this particular
file- it's what I've been calling a metadata file.
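As a concrete (and entirely hypothetical) sketch, a "pcr.yaml"-style metadata file for a thermocycler might look something like this; every field name here is invented for illustration, not a finished format:

```yaml
# Hypothetical machine metadata file; all field names are invented.
machine: open-thermocycler
block:
  wells: 16
  tube_volume_ul: 200
temperature:
  min_c: 4
  max_c: 99
interface: serial
```

The appeal of YAML here is exactly what's described above: it opens fine in Notepad as a text file, but a program can also load it and compare it against a protocol's requirements.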

When you say a list of the hardware and electronics, what do you mean?
What are you listing in particular? What database are you referencing?
If you say resistor, which one? Is it from a digikey catalog, or
mouser catalog? The advantage of me writing all this software for all
you other folks (hehe) is that you don't have to deal with this nitty
gritty, and just say "these parameters have to be specified", and then
the kit can be ordered with those parts, or somebody can spend some
time and write an interface to the digikey/mouser/amazon/radioshack
catalogs. I heard Best Buy recently exposed their API over the web-
too bad they don't sell things that we actually need ;-).

What do you mean when you say "links to software that would properly
address each interface"? The way that I have been dealing with this is
an algorithm to check whether or not something that says "gives 3000
psi" is wired up to something that says "5 to 20 psi acceptable range"
(more or less- the format is more general than this). This way, two
parts can be checked for compatibility with a simple python function,
or a simple program that you won't have to look at (this is backend
programming stuff, for anybody who thinks this is horrible- and yeah,
I'm being overly elaborate, but I don't think I've explained these
things properly before).
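The "gives 3000 psi" vs. "5 to 20 psi acceptable range" check described above reduces to a range-containment test. This is a minimal sketch (the data shapes and the psi example values come from the paragraph; units are assumed to already match):

```python
# Sketch of the compatibility check: does one part's rated output fall
# within another part's acceptable input range?
def compatible(output_value, acceptable_range):
    """True if the output rating lies inside the (low, high) range."""
    lo, hi = acceptable_range
    return lo <= output_value <= hi

print(compatible(3000, (5, 20)))  # False: 3000 psi exceeds 5-20 psi
print(compatible(12, (5, 20)))    # True
```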

An excellent example of packaging done right is debian. I don't know
if you've ever booted up ubuntu or debian or some other linux
installation. Basically, there is a package management system. Package
management means that you don't have to sit there and install
different software: you type a command, or select a few software
packages from a list, and because there were people who made the
packages in a certain way, they automatically install (or you can
configure them to do something weird). I highly encourage you to go
download and burn an ubuntu disc. And if you're worried about that,
and if you use Windows, go get "wubi", which is an easier way to
install ubuntu. Personally, I don't like Windows, but I just reference
it since I have no idea what people are familiar with, in an attempt
to be verbose and helpful.

There are some documents that explain what these things are, over the web.

Debian new package maintainer's guide
http://www.debian.org/doc/maint-guide/

RPM package building guide (concise)
http://rpm.rutgers.edu/guide.html

Wikipedia on package management systems in general
http://en.wikipedia.org/wiki/Package_management_system
http://en.wikipedia.org/wiki/Software_repository
http://en.wikipedia.org/wiki/Version_control

Finally, on the topic of CAD. I think that CAD is an important thing
to include. With freely available tools like HeeksCAD, BRLCAD,
avocado, etc., there's little excuse to not just go and play around
with the systems. Note that BRLCAD is going to be complicated on
Windows (I've never tried it)- so I recommend HeeksCAD for anyone who
might not know what they are doing.

http://code.google.com/p/heekscad/

Of course, you don't have to bother yourself with CAD- if you have
JPEG drawings, you can upload them to Ponoko, and they convert it into
a 2D CAD file. Sketchup, a Google app, also does something like that
IIRC. But really, it's not as terrible as the 80s with really funky
commands just to move objects around. Unless you use BRLCAD ;-)
(unfortunately, I'm not joking). Alternatively, you can talk with me
about the files or something and I'll be glad to package it up,
especially once the software starts working better- I still recommend
formats like IGES, STEP, etc., as we were recently discussing. Maybe
there's something more I can do to help people? Just speak up ..

Meredith L. Patterson

Apr 15, 2009, 7:38:03 PM
to diy...@googlegroups.com
Well, "CAD drawing" is just a human-viewable way of saying "DXF file"
-- DXF being the Drawing [Interchange|Exchange] Format, developed by
Autodesk way back when they were first building AutoCAD. A DXF file is
in text format, though it isn't especially human-readable. CAD
software can present it as an image; it can also be rendered to
various human-viewable formats, such as SVG and PDF.

Here's an example of a DXF file that a laser cutter can use in order
to cut out some of the components for a RepStrap:

http://reprap.svn.sourceforge.net/viewvc/reprap/trunk/reprap/mechanics/bits-from-bytes-designs/DXFs-for-laser-cutter/cartesian-bot-5mm-laser-cut-acetal.dxf?revision=1780&view=markup

Note that DXF is really just "the dimensions of everything"; other
problem domains have evolved their own formats, which are really
domain-specific languages, for their own needs. For example, the
Gerber format (http://en.wikipedia.org/wiki/Gerber_File) is a
text-format, not especially human-readable way to represent traces,
vias, layers, milling and drilling instructions, &c for printed
circuit boards. (If you're curious what it looks like, there's a small
sample Gerber file in the article at
http://www.artwork.com/gerber/appl2.htm .) EDA (Electronic Design
Automation) software, which is what people use to graphically lay out
PCBs, can generate Gerber files which can be sent to a fabricator; the
fab shop just loads up the Gerber file and literally prints your
circuit board.

For "software that would properly address each hardware interface",
that's where Bryan's mention of software packaging a la Debian comes
into play: a package can state what its dependencies are, e.g. what
libraries it relies on. If more than one software package would
suitably address the dependency, the user has the option to decide
which one she wants.

Tying this stuff all together is tricky work, and there will no doubt
be things that we have to implement on our own -- synthetic biology
will need to invent its own domain-specific languages in the same way
that CAD and EDA had to. But a lot of the wheels have already been
invented.

Cheers,
--mlp

Jake

Apr 15, 2009, 8:16:10 PM
to DIYbio

In the previous thread Cory said:

> I realize this would require a USB port and more processing power.
> But if I'm going to ditch my old thermocycler for a home-brew machine
> it needs to actually have better functionality. Also, as Jake said, a
> heated lid is a must.
> -Cory


Mine doesn't have a heated lid :<

I think the USB idea is probably too complicated, or at least outside
what I've worked with. But you could have a serial connection plugged
into a cheap serial->USB adapter. I think you can get them pretty
cheap on ebay. That would also make a good cord for it since you
could just cut one end off a serial cable and wire it right to the
board.

I think with just a serial cable you could still do just about
anything you'd want. My spec (spectrophotometer) just has a serial
port and no software. I have to open HyperTerminal, capture the output
on the serial port, save it to a file, and import it into Excel. (I'm
using a serial -> USB cable with it BTW.) From Excel I can make charts
and graphs and overlay multiple spec scan readings and save things to
a file. But all the spec does is put out data like "[nm] [TAB]
[absorbance] [CR]" for the scan range ordered.
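Assuming the "[nm] [TAB] [absorbance] [CR]" output format described above, a few lines of Python could replace the HyperTerminal-to-Excel dance by writing a CSV directly (the sample readings here are invented):

```python
import csv
import io

# Hypothetical captured serial output: wavelength TAB absorbance per line.
raw = "260\t0.523\r\n280\t0.291\r\n"

# Parse each line into (nm, absorbance).
rows = []
for line in raw.splitlines():
    if not line.strip():
        continue
    nm, absorbance = line.split("\t")
    rows.append((int(nm), float(absorbance)))

# Write a CSV that Excel or OpenOffice Calc opens directly.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["nm", "absorbance"])
writer.writerows(rows)
print(out.getvalue())
```

In practice you'd read from the serial port (e.g. with a serial library) instead of a string, and write to a real file instead of an in-memory buffer.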

So I think it would be pretty easy to cobble something together.
People who start using it might desire a fancier interface and program
something in java or VB for it.

As for the actual requirements... I'm not sure exactly how a
thermocycler's software works. Is it just like an automated
thermostat, or does it use fancy regulation to account for ramp times
and thermal transfer? For heated lids, do they cycle with the same
temp as the block, or do they run hotter just to keep condensation off
the lid and top parts of the tube?
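To illustrate the thermostat-vs.-regulation question, here is a toy proportional controller for a heated block; the thermal model (heating and loss constants) is entirely made up. Note that a plain proportional loop settles a little below the setpoint, which is one reason real controllers use fancier regulation (e.g. adding integral and derivative terms, i.e. PID) and also shape ramp rates:

```python
# Toy block-temperature simulation under proportional-only control.
# All constants are invented; this is a sketch, not a real thermal model.
def simulate(setpoint_c, gain, steps=200):
    temp = 25.0  # start at room temperature
    for _ in range(steps):
        # Heater duty cycle proportional to error, clamped to 0..1.
        power = min(max(gain * (setpoint_c - temp), 0.0), 1.0)
        # Heating from the element minus loss to the surroundings.
        temp += 5.0 * power - 0.02 * (temp - 25.0)
    return temp

final = simulate(94.0, gain=0.1)
print(round(final, 1))  # settles a bit below 94: proportional-only offset
```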

Talk of building a thermocycler got me to thinking about mine. It
seems pretty simple, it's got an aluminum block with about 16 holes
bored into the top for the tubes. Attached to the bottom of the block
is a peltier junction with an aluminum heatsink and fan beneath that.
For the interface it has a membrane keypad and a one-line segmented
LED display. It must also have a thermistor somewhere to monitor
the temperature of the block, although I don't remember where. It has
an insulated, unheated lid that is hinged on the back and closes to
cover the top of the block and tubes. It also has a ventilation fan
at the back of the unit.

It has a number of built in programs and 6 or so user programmed
program memory slots. It also has a manual setting/temp hold mode.
If I turn it all the way down it will freeze a water drop placed on
the top of the block, likewise it will boil a drop when turned up.


I was thinking it might be nice to start with a small unit, say 4-6
tubes arranged in a line or small square depending on what size
peltier junctions can be had off-the-shelf for cheap. Then make the
unit small, cheap, and simple. That way you can easily keep it near
the computer and use that to program and monitor it.

It still needs to have a few buttons on it. I think all you'd really
need is a start/stop, hold, and a few buttons to select one of the
currently stored programs. For everything else you'd use the computer,
and you could also retask any of the buttons pretty easily to add
features. You'd also want a few led lights for things like cycle on/
off, etc.. Each light can be in three states: on, off, or blinking.

A cheap little microcontroller on a fairly simple and small PCB should
be able to manage all these functions. The ATmega8 has six 10-bit A/D
channels (for temp channels or other sensors), a couple of timers and a
real-time clock, three PWM channels (for driving things or speaker
noises), it also has a bunch of I/O pins that can be used for lights
or buttons.

Maybe a cheap membrane keyboard would work well. You could program
the functions, modes, and programs and have a button for each one. I
don't think you'd really need a display, although you could run a
segmented display pretty easily using the I/O lines. The display
isn't all that useful for me since it either displays cryptic patterns
(I don't have the manual) or the current temp. Since I know it works
I don't really need a temp display, it could just as easily be
replaced by an error light if the block strays too far off the program
temp..

Other features that would be nice would be a beeper or speaker for
alarms and letting you know when certain points in the cycle are hit.
Like adding Taq right at the right point after initial denaturation
for hot start modes. It would also be nice to have it beep for a
while when done, or cool and hold your tubes until you retrieve them,
or cool them down if nobody answers the beeps after a while (for
forgotten reactions).

With a little controller on a simple board we could do about every
fancy feature you'd want and play with all kinds of experimental
stuff. Thinking about my thermocycler, I'd love to rip out its
worthless and cryptic interface and replace it with an open-source
hardware design. I'd already have the hardware (block, junction,
fans, and case) so it probably wouldn't be that hard. I bet there are
lots of people out there who'd love to repair or replace their old
ones with a new computerized interface. You could also just buy broken
cyclers for cheap and use the parts to build your own with open
hardware/software.


-Jake



> The point is that a computer-readable format has
> previously been discussed here (and I'm fine discussing it again of
> course) and it just seems to be going backwards to have this full
> block of English text about a protocol.

Thanks for cluing me in on the computerized protocols Bryan. I'm just
not really up on that type of computerization. I'll have to catch up
in the future. For now I just figure we need to come up with the
minimum specs, some sort of idea how the hardware will go together,
and a list of the important features everyone wants.


-Jake

Bryan Bishop

Apr 15, 2009, 8:23:20 PM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 7:16 PM, Jake <jake...@mail.com> wrote:
>> The point is that a computer-readable format has
>> previously been discussed here (and I'm fine discussing it again of
>> course) and it just seems to be going backwards to have this full
>> block of English text about a protocol.
>
> Thanks for cluing me in on the computerized protocols Bryan.  I'm just
> not really up on that type of computerization.  I'll have to catch up
> in the future.  For now I just figure we need to come up with the
> minimum specs, some sort of idea how the hardware will go together,
> and a list of the important features everyone wants.

I don't see how that differs from what I'm talking about. Same thing.
Maybe you can point out anything that differs significantly. For
instance, you say that your device has 16 holes under the thermocycler
lid (where you place the micro-tubes), and those tubes each have
separate but equal operating conditions, which is exactly what you're
trying to specify. That's the metadata that the format I'm proposing
is going to store (I would say it already sort of works, but I don't
have an entire working code suite working at the moment).

Also, you mentioned hyperterminal. That brings back some old, old
memories. Is that really how you use it? Are you building your own
thermocycler, and if so, are you wiring it up to your computer and
accessing it via hyperterminal? Just wondering. I wouldn't bother with
drag/drop for Excel- if you need some help, one of us (maybe even me?)
could write a program to just save the data to an OpenOffice Calc
(spreadsheet) file, or something compatible with Microsoft Excel
(maybe a CSV file- a comma-separated values format).
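For what it's worth, the CSV route is only a few lines. A minimal
Python sketch (the sample data and column names here are invented, not
output from any real cycler):

```python
import csv
import io

# Hypothetical samples a cycler might report over a serial link:
# (elapsed seconds, block temperature in C).
samples = [(0, 25.0), (30, 94.1), (60, 93.9), (90, 55.2), (120, 72.0)]

def samples_to_csv(samples):
    """Serialize (time, temp) pairs to CSV text that Excel or
    OpenOffice Calc will open directly."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["elapsed_s", "block_temp_c"])
    writer.writerows(samples)
    return buf.getvalue()
```

Dump that string to a .csv file and the spreadsheet import problem
goes away.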

Out of curiosity, does anyone else use hyperterminal with other lab equipment?

Meredith L. Patterson

Apr 15, 2009, 8:36:36 PM
to diy...@googlegroups.com
On Thu, Apr 16, 2009 at 2:16 AM, Jake <jake...@mail.com> wrote:
> I think the USB idea is probably too complicated, or at least outside
> what I've worked with.

Standalone USB controller ICs are usually in a surface-mount package
and thus a bit of a pain to solder, but if someone wanted to set up
PCB fabrication with a fab shop once the design is further along, it's
generally pretty trivial to get the fab shop to solder surface-mount
components on for boards that are meant for kits. Limor Fried's
Boarduino (http://www.ladyada.net/make/boarduino/) has a USB version,
and if you order a USB Boarduino kit you get a PCB with the USB chip
presoldered and the rest of the components in a baggie.

Also, many higher-end microcontrollers speak USB natively. I have a
bit of experience with the Freescale 8-bit micros, which are well
within the price range for a DIY project (between $3 and $12 apiece).
These chips also come in surface-mount packages, but I have
successfully soldered a Freescale micro in a QFP package by hand, and
a SchmartBoard makes it even easier.

I strongly recommend going with a micro that speaks USB. They're
really not all that expensive, and can be programmed in C. I'd be
happy to contribute to or even take the lead on programming the
firmware for a Freescale-brained thermocycler. These chips have a ton
of GPIO lines that can be used to control the peltier junction, send
output to an LED display or LCD, control a ventilation fan (if needed;
that could just be always-on) and read input from a thermistor.
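The control loop those GPIO lines would run is simple at heart. The
real firmware would be C on the micro; here the logic is sketched in
Python, with the gain and names invented for illustration:

```python
def pwm_duty(setpoint_c, measured_c, kp=0.08):
    """Proportional-only control: returns a drive level in [-1, 1].
    Positive duty heats, negative cools (i.e. the sign would flip an
    H-bridge driving the Peltier). A real loop would add I and D terms."""
    error = setpoint_c - measured_c
    return max(-1.0, min(1.0, kp * error))

# Block at 25 C, target 94 C: full heating.
assert pwm_duty(94.0, 25.0) == 1.0
# Near the setpoint the drive tapers off instead of slamming.
assert 0 < pwm_duty(94.0, 93.0) < 0.1
```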

> A cheap little microcontroller on a fairly simple and small PCB should
> be able to manage all these functions. The atmega8 has 6 10-bit A/D-
> converters (for temp channels or other sensors), a couple timers and a
> real-time clock, three PWM channels (for driving things or speaker
> noises), it also has a bunch of I/O pins that can be used for lights
> or buttons.

I'm not sure that the 8K of Flash that the ATMega8 provides would be
enough to hold all the stored programs that are needed, but I tend to
be pessimistic about that sort of thing, and there are a lot of tricks
for optimizing program size. I'll have my Arduinos back in about a
week, though, so I can start screwing around with putting something
together on that platform.

Arduino images tend to be a bit bloaty, but one argument for using
that platform -- or possibly the new ATMega128 version -- is that the
barrier to entry for learning to code for them is extra-specially low,
so it might make sense to target the AVR on the grounds that it
becomes easier for people to contribute to the project. And, again,
you get onboard USB. Plus there's nothing stopping us from putting the
Arduino firmware on a custom PCB, should we so desire. This would
actually make it *really* easy to upgrade the thermocycler's firmware
-- plug in USB cable, load up new image, boom, you're done.

> Other features that would be nice would be a beeper or speaker for
> alarms and letting you know when certain points in the cycle are hit.
> Like adding Taq right at the right point after initial denaturation
> for hot start modes. It would also be nice to have it beep for awhile
> when done, or cool and hold your tubes until you retrieve them, or
> cool them down if nobody answers the beeps after awhile. (for
> forgotten rxns)

Ooh, I like those ideas. Easy to program, too.
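Those end-of-run behaviors reduce to a couple of lines of state logic.
A rough Python sketch, with the timings and names invented:

```python
def post_run_action(elapsed_s, beep_for_s=300, hold_temp_c=4.0):
    """After the final cycle: beep for a while to summon the user,
    then go quiet but keep the block at a cold hold for forgotten
    reactions."""
    return {"beep": elapsed_s < beep_for_s, "setpoint_c": hold_temp_c}
```

The main loop would call this once a second after the program ends and
drive the speaker and Peltier accordingly.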

> You could also just buy broken
> cyclers for cheap and use the parts to build your own one with open
> hardware/software.

I may do just this when I get some free time.

Cheers,
--mlp

Tito Jankowski

Apr 15, 2009, 8:54:05 PM
to diy...@googlegroups.com
What about running the Open Thermal cycler like a network device + web interface? Ever configured a Linksys router? (i.e. a control interface like the configuration interface on a router or wireless AP)

Tito

Bryan Bishop

Apr 15, 2009, 9:08:02 PM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 7:54 PM, Tito Jankowski <titoja...@gmail.com> wrote:
> What about running the Open Thermal cycler like a network device + web
> interface? Ever configured a Linksys router? (i.e. a control interface like
> the configuration interface on a router or wireless AP)

I would also make an analogy to networked printers. Anybody with
access can print through cupsd or some networked-printing-server. Most
printers out on the market now have a built-in web server and know
about connecting to different types of networks, like SAMBA. So, if
you're going to implement a networked thermocycler (i.e., it's going
to have an IP address and all), consider looking into printer server
architectures for networked devices. Also, for investigating router
software, see OpenWRT.

OpenWRT
http://openwrt.org/

"OpenWrt is described as a Linux distribution for embedded devices.
Instead of trying to create a single, static firmware, OpenWrt
provides a fully writable filesystem with package management. This
frees you from the application selection and configuration provided by
the vendor and allows you to customize the device through the use of
packages to suit any application. For developers, OpenWrt is the
framework to build an application without having to build a complete
firmware around it; for users this means the ability for full
customization, to use the device in ways never envisioned."

Oh. Hm. I always thought OpenWRT was for reflashing netgear firmware
for ethernet/wireless routers. Guess it has other uses.

Meredith L. Patterson

Apr 15, 2009, 9:14:13 PM
to diy...@googlegroups.com
On Thu, Apr 16, 2009 at 3:08 AM, Bryan Bishop <kan...@gmail.com> wrote:
>
> On Wed, Apr 15, 2009 at 7:54 PM, Tito Jankowski <titoja...@gmail.com> wrote:
>> What about running the Open Thermal cycler like a network device + web
>> interface? Ever configured a Linksys router? (i.e. a control interface like
>> the configuration interface on a router or wireless AP)
>
> I would also make an analogy to networked printers. Anybody with
> access can print through cupsd or some networked-printing-server. Most
> printers out on the market now have a built-in web server and know
> about connecting to different types of networks, like SAMBA. So, if
> you're going to implement a networked thermocycler (i.e., it's going
> to have an IP address and all), consider looking into printer server
> architectures for networked devices. Also, for investigating router
> software, see OpenWRT.

Using a microcontroller that supports OpenWRT will increase costs
enormously and open up security holes. I don't want an operating
system on my thermocycler.

I have some ideas about writing a dedicated interface (that will speak
a very limited subset of HTTP, and spit out HTML) on top of uIP
(http://www.sics.se/~adam/uip/index.php/Main_Page) that I'll have to
elaborate on later, since I need to help Len pack for CodeCon, but
don't let me forget this.
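To give the flavor of that limited-HTTP idea, here is the request
handling sketched in Python (on uIP it would of course be C, and the
status fields here are invented):

```python
def handle_request(request_line, status):
    """Speak just enough HTTP to serve one status page: answer
    'GET /' with an HTML snapshot of the run, 404 everything else."""
    if request_line.startswith("GET / "):
        body = ("<html><body>Block: %.1f C, cycle %d</body></html>"
                % (status["temp_c"], status["cycle"]))
        return "HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n" + body
    return "HTTP/1.0 404 Not Found\r\n\r\n"
```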

Cheers,
--mlp

JonathanCline

Apr 15, 2009, 10:48:41 PM
to DIYbio
On Apr 15, 6:35 pm, Bryan Bishop <kanz...@gmail.com> wrote:

> there are three formats that I know of- one was introduced by Mac
> (EXACT), and it turned out to have some haskell programming behind it
> which was a pretty big plus. The other two formats are CLP-ML and PSL.
> CLP-ML was found in the literature as a way to represent clinical
> laboratory protocols in XML, so the authors of the paper of course
> uploaded an XML DTD as the supplementary material.


An average laptop has plenty of lexical analyzing power to parse
near-English equivalent protocols.  Forcing biologists to write their
recipes in some ASCII-but-otherwise-bizarre format is probably going
to progress slowly.  Biologists usually become biologists because they
shy away from technology, so even asking these end users to learn HTML
is out of their focus.   Throw computation power at the problem rather
than human effort at the problem.  Let the biologists remain experts
in biology.

In the cooking world, there are/were lexical analyzers to convert
"plain text cooking recipes" into computer readable interchange format
for data storage and sharing. The "plain text recipe format" was
pretty loose on the whitespace acceptance, measurement style, verb/
noun placement, etc. Biologists can write english with specific
vocabulary and it can be computer readable. They should do more of
that, in a regularly-formatted way.
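As a small proof of concept of that kind of loose parsing, here is a
regex sketch in Python that pulls temperature/time pairs out of
PCR-style step lines; the pattern is invented and nowhere near a full
grammar:

```python
import re

# Matches lines like "94oC for 30 sec" or "72C 1 min" (whitespace,
# the 'o' degree marker, and the word 'for' are all optional).
STEP = re.compile(
    r"(\d+(?:\.\d+)?)\s*o?C\s*(?:for)?\s*(\d+)\s*(sec|s|min)", re.I)

def parse_step(text):
    """Return (temp_c, seconds) for a recognized step line, else None."""
    m = STEP.search(text)
    if not m:
        return None
    seconds = int(m.group(2)) * (60 if m.group(3).lower().startswith("m") else 1)
    return (float(m.group(1)), seconds)

assert parse_step("94oC for 30 sec") == (94.0, 30)
assert parse_step("72C 1 min") == (72.0, 60)
```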

I haven't seen that many biology protocols of course, other than
what's on OWW and http://www.protocol-online.org . I find the
pcr.xml to be completely unreadable and unwritable compared to what's
on the wikis. The following PCR example has a good visual format and
is likely very parsable by a laptop into "machine-code":
http://kuchem.kyoto-u.ac.jp/seika/shiraishi/protocols/cloning_PCR_products.html

Or has a lexical analyzer been done in bio already?

Bryan Bishop

Apr 15, 2009, 10:56:48 PM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 9:48 PM, JonathanCline <jnc...@gmail.com> wrote:
> On Apr 15, 6:35 pm, Bryan Bishop <kanz...@gmail.com> wrote:
>> there are three formats that I know of- one was introduced by Mac
>> (EXACT), and it turned out to have some haskell programming behind it
>> which was a pretty big plus. The other two formats are CLP-ML and PSL.
>> CLP-ML was found in the literature as a way to represent clinical
>> laboratory protocols in XML, so the authors of the paper of course
>> uploaded an XML DTD as the supplementary material.
>
> An average laptop has plenty of lexical analyzing power to parse
> near-English equivalent protocols.  Forcing biologists to write their

I'd like you to show me that. Genuinely, I would, I would love to see
that in action.

> recipes in some ASCII-but-otherwise-bizarre format is probably going
> to progress slowly.  Biologists usually become biologists because they
> shy away from technology, so even asking these end users to learn HTML

Nobody is asking them to learn HTML. I've talked to you about writing
fancy frontends and wizards, or letting them talk to 'package
maintainers' who know the super secret ninja arts (or whatever we're
calling this).

> is out of their focus.   Throw computation power at the problem rather
> than human effort at the problem.  Let the biologists remain experts
> in biology.

Sorry, but you can't just assume a computer is magic.

> In the cooking world, there are/were lexical analyzers to convert
> "plain text cooking recipes" into computer readable interchange format

Please show me. The only reference on this I can find was from a book
published in 1985 called Computational Recipes where a fellow named
David came up with a polish or posix-style notation representation of
cooking. It turns out he had a consulting business in industrial
kitchen automation, or something. But now he doesn't seem to be around
on the internet.

> for data storage and sharing.   The "plain text recipe format" was
> pretty loose on the whitespace acceptance, measurement style, verb/
> noun placement, etc.   Biologists can write english with specific
> vocabulary and it can be computer readable.  They should do more of
> that, in a regularly-formatted way.

I don't see how this is different from using particular interfaces to
validate their grammars and so on. What's the big deal?

> I haven't seen that many biology protocols of course, other than
> what's on OWW and http://www.protocol-online.org .   I find the
> pcr.xml to be completely unreadable and unwrittable compared to what's

Yeah, that's not the human readable output. I would write a generator
that would take that information and generate instructions, plus
information on each of the pieces of equipment if necessary. Think of
it as a verbosity flag on some shell program (ok, then imagine
whatever GUIs you like on top of that). Btw, that's why I was
originally suggesting YAML for the representation of recipes- it would
be much easier to read and even human-writable, but at the same
time it's an equivalent format for object serialization for Python,
Perl, and whatever other languages have YAML implementations.
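For illustration, the "quickie" PCR program from the top of this
thread might serialize to YAML roughly like this (the field names are
invented, not an agreed schema):

```yaml
# Hypothetical serialization - not a standard format.
protocol: quickie-pcr
steps:
  - {name: denature, temp_c: 94, hold_s: 30}
  - {name: anneal,   temp_c: 45, hold_s: 30}
  - {name: elongate, temp_c: 72, hold_s: 30}
cycles: 30
final_hold_temp_c: 4
```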

> on the wikis.   The following PCR example has a good visual format and
> is likely very parsable by a laptop into "machine-code":
> http://kuchem.kyoto-u.ac.jp/seika/shiraishi/protocols/cloning_PCR_products.html

While it looks good, I've seen some pretty terrible things done with
that format. The other day I was reading a protocol that asked me to
then apply "2 volumes EtOh". wtf?

> Or has lexical analyzer been done in bio already?

Dunno, but I'd like to hear about it.

Jason Morrison

Apr 16, 2009, 12:02:21 AM
to diy...@googlegroups.com, kan...@gmail.com
I just want to chip in real quick and say that I'm loving this
conversation. Two things, in particular:

1. The idea of computer-readable protocols. The benefits of this are manifold.
2. The fact that aiming for computer-readable protocols does not mean,
in any way, that we have to write them in raw <inscrutable and arcane,
however logical it may be, computer format here>. We should be able
to specify them in a reasonably intuitive manner. There are plenty of
graphical metaphors for setting out a list of instructions that talk
about well-described operations with well-formed units; imperative
flow charts, declarative data flow "patches," dependency graphs, etc.

I call it a graphical programming language, but call it whatever you
like; in my opinion, protocols and workflows should look like this, if
not more beautiful:

http://www.omnigroup.com/images/applications/omnigraffle/features/smartguides.jpg
http://www.sorosy.com/lego/halloweenclaw/images/screenshot.PNG
http://sintixerr.files.wordpress.com/2009/02/simplevizjpg-ready.jpg

Software could also optimize *your* time - if it knows when you have
to be attending a protocol, and when you can "just let it simmer,"
then you could tell your computer all the experiments you'd love to
do, and it could give you a schedule of what to do this week,
accounting for how many pieces of equipment you have, what products of
which protocol feed into others, who else is using your lab, and when
you'd like to take a lunch break.

Thank goodness we're living in the future! and all this is possible,
Jason

--
Jason Morrison
jason.p....@gmail.com
http://jayunit.net
(585) 216-5657

Bryan Bishop

Apr 16, 2009, 12:19:18 AM
to Jason Morrison, kan...@gmail.com, diy...@googlegroups.com
On Wed, Apr 15, 2009 at 11:02 PM, Jason Morrison
<jason.p....@gmail.com> wrote:
> I just want to chip in real quick and say that I'm loving this
> conversation.  Two things, in particular:
>
> 1. The idea of computer-readable protocols.  The benefits of this are manifold.

It's interesting, though, that this hasn't actually happened yet.
Why is it that we can count the number of examples with our fingers? I
mean, to me, this seems fairly obvious, intuitive, even easy. But on
the other hand, it just doesn't really exist at the moment. I do know
however that there has been significant push to get bioinformatics
databases in better states- there was some letter circulating around
in the bioinformatics journals about standards, or something. But I
don't remember anything in that 'open community letter' about protocol
representation.

> 2. The fact that aiming for computer-readable protocols does not mean,
> in any way, that we have to write them in raw <inscrutable and arcane,
> however logical it may be, computer format here>.  We should be able

That's right- we can have it so that computer-readable protocols are
human-readable, or can be transformed into human readable forms. Also,
the idea of parsing human readable text, into something that is a
structured computer form- kind of like metadata structure in OCR or
something. But in general I wouldn't trust this going over the
protocol-online.org dataset, which is not a structured dataset.

>  I call it a graphical programming language, but call it whatever you
> like; in my opinion, protocols and workflows should look like this, if
> not  more beautiful:

I do some work in a lab that generates graphs through an open source
program called 'graphviz'. Essentially, what we do is convert
functional structure diagrams to these sorts of visual graphs. They
are not representations of programming grammars, though.

So, protocols could be converted to graphical visualizations, but I
don't know if you're actually talking about something like 'turtle',
the graphical programming language, or scratch.mit.edu, or something.

> Software could also optimize *your* time - if it knows when you have
> to be attending a protocol, and when you can "just let it simmer,"
> then you could tell your computer all the experiments you'd love to
> do, and it could give you a schedule of what to do this week,
> accounting for how many pieces of equipment you have, what products of
> which protocol feed into others, who else is using your lab, and when
> you'd like to take a lunch break.

Yes. Schedule optimization, like when you should do what, is also a
problem that computers should solve. When I first entered university,
I spent a few hours one day with some sticky notes and a calendar
trying to get a course schedule optimized. After a while I just sat
there and figured that this is a problem that my computer should be
solving- so I wrote a schedule optimizer, a combinatorial constraint
solver sort of, except it's not generic and it's highly constrained to
that particular problem space. I think problems like these that occur
in the lab (like what to do first, or how to organize a project or
something) could be done with computational methods, very easily, or
"saved" results could be used again if you like some particular
organizational scheme to conducting a certain type of protocol, or
something.
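As a toy stand-in for that kind of solver, here is greedy
earliest-finish scheduling of runs on a single machine (Python; the
task data is invented):

```python
def schedule(tasks):
    """Pick a maximal non-overlapping set of (name, start, end) runs
    for one machine by always taking the earliest-finishing task.
    A real lab scheduler would juggle many machines and dependencies."""
    chosen, busy_until = [], 0
    for name, start, end in sorted(tasks, key=lambda t: t[2]):
        if start >= busy_until:
            chosen.append(name)
            busy_until = end
    return chosen

runs = [("pcr_a", 0, 3), ("gel", 2, 4), ("pcr_b", 3, 6)]
assert schedule(runs) == ["pcr_a", "pcr_b"]
```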

> Thank goodness we're living in the future! and all this is possible,

- Bryan

Nathan McCorkle

Apr 16, 2009, 12:40:34 AM
to diy...@googlegroups.com
Bryan, can you give a brief description of the machine language/UML thing you're talking about? (without me looking at that old thread right now)

Why not just build a spec sheet for it? Maybe this format you mention is just that, but really it could all be in a text file, no need for CAD drawings, etc. Just give the dimensions of everything, a list of the hardware and electronics, the wiring scheme, and links/examples to/of software that would properly address each hardware interface.
 

I thought we were focusing mainly on the electronics now, so most of my previous post was geared toward computer engineering.

By a list of hardware and electronics, I mean just that: I think we should develop a simple list of parts and their technical identification specifics. Just have a list of the needed devices, with manufacturer, part ID, maybe a reseller's web link, and a quick description of the part. Then throw in a pinout schematic with parts labelled, etc, and maybe a pre-arranged PCB board file too.

When I asked about software device linking, I meant the actual microcontroller code, for whatever controller is in the final design... such as binary modulation of a relay that controls the Peltier junction, or PWM on the Peltier, as controlled by some thermocouple, thermistor, or digital two-wire temperature sensor. Time clock, USB I/O, maybe ethernet as well, or possibly just two-wired to an optional NetMedia SitePlayer chip package for web functionality.

Work on the basic piece of equipment, get it, then program OCR software and crazy algorithms to translate laboratory experiment protocols. All PCR is concerned about is temperatures and times, if we have a solid base piece of hardware, it should be no problem to easily implement the functionality you mentioned.
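Since a PCR run really is just temperatures and times, the whole
program can be flattened to a list of setpoints. A Python sketch
(structure invented), mirroring the "quickie" protocol at the top of
the thread:

```python
def expand_program(pre, cycle, n_cycles, post):
    """Flatten a cycler program into a flat list of
    (temp_c, seconds) setpoints for the controller to step through."""
    return list(pre) + list(cycle) * n_cycles + list(post)

pre = [(94, 180)]                       # initial denaturation
cycle = [(94, 30), (45, 30), (72, 30)]  # denature / anneal / elongate
post = [(72, 300)]                      # final elongation
steps = expand_program(pre, cycle, 30, post)
assert len(steps) == 1 + 3 * 30 + 1
assert steps[1] == (94, 30)
```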

Jason Morrison

Apr 16, 2009, 12:44:14 AM
to Bryan Bishop, diy...@googlegroups.com
>> 1. The idea of computer-readable protocols.  The benefits of this are manifold.
> It's interesting though that this hasn't actually happened yet though.
> Why is it that we can count the number of examples with our fingers?

Perhaps it's that, at least today, one actually does have to write all
this XML if we want a computer-readable protocol. The lack of a
usable design tool means that you have to care about this in order to
write protocols in a computer-readable way. If a piece of software
existed that (1) made it easy to write protocols and (2) made it easy
to read/share/follow/measure/track/etc. protocols (i.e. a value add
above transcribing), and it was made known, I think we might see more
of these documents pop up.

An alternate question is: imagine a universe with no CAD software, and
asking "why is the only DXF file the one written by the DXF spec
author?"

> So, protocols could be converted to graphical visualizations, but I
> don't know if you're actually talking about something like 'turtle',
> the graphical programming language, or scratch.mit.edu, or something.

Yes, I mean a graphical design tool that hides the "source code" from
the protocol expert (biologist, chemist, ...) and lets them think in
terms of their domain. Certainly, the same tool could read a
computer-readable protocol file and produce a flowchart, or English
instructions (or German or Mandarin, for that matter), or a daily
schedule.

> When I first entered university,
> I spent a few hours one day with some sticky notes and a calendar

> trying to get a course schedule optimized...

Ha! I got a kick out of that story :) Luckily, some clever
programmers at my school built a scheduling tool that scraped course
day/times from the university course registration server and made this
a super simple process, so I chose to not have the pleasure of writing
that myself.

Anyhow, glad to see the discussion on computer-readable protocols.
Back to hacking hardware so this isn't all daydreaming, I suppose...

-j

Bryan Bishop

Apr 16, 2009, 12:49:32 AM
to diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 11:40 PM, Nathan McCorkle <nmz...@gmail.com> wrote:
>> Bryan, can you give a brief description of the machine language/UML thing
>> you're talking about? (without me looking at that old thread right now)
>>
>> Why not just build a spec sheet for it, maybe this format you mention is
>> just that, but really it could all be in a text file, no need for CAD
>> drawings, etc... just give the dimensions of everything, a list of the
>> hardware and electronics, the wiring scheme, and links/examples to/of
>> software that would properly address each hardware interface.
>
>  I thought we were focusing mainly on the electronics now, so most of my
> previous post was geared toward computer engineering.

Oh, maybe we were. Yeah, so computer engineering projects are still
within the domain. I know this isn't what you were talking about, but
if you haven't seen the packaging that the opencores people are doing,
you should go take a look- that's computer engineering at its finest.
But just for electronics and stuff, ok- thanks for the clarification.

> By a list of hardware and electronics, I mean just that, I think we should
> develop a simple list of parts and their technical identification specifics.
> Just have a list of the needed devices, with manufacturer, part ID, maybe a
> reseller's web link, and a quick description of the part. Then throw in a

Well, that's what I'm trying to move away from. A thousand links of a
thousand different parts is a step backwards- I want to unify that
interface so that I don't have to hunt down all of the parts all the
time. This is somewhat the idea of octopart.com, although it kind of
fails to work. But yeah, that's the basic idea, that there has to be a
way to actually implement the kits or projects and packages and so on-
either with parts that *you* make, or parts that you buy or somehow
happen to have in your inventory, and the various options for buying
them, just like you have various options for which mirror you want to
download stuff from when you use sourceforge or various
high-visibility projects that need that sort of infrastructure.

> pinout schematic with parts labelled, etc, and maybe a pre-arranged PCB
> board file too. When I asked about software device linking, I meant the

IIRC, there are a few standard formats for PCB file representation. I
think gEDA had some free tools for this somewhere.

> actual microcontroller code, for whatever controller is in the final

Microcontroller code stuffs have already hit the repositories, I seem
to recall an arduino-specific code repository, another one about
buglabs, and I'm sure I've seen microcontroller stuff in the debian
repositories that I'm wired up to.

> design... such as binary modulating of a relay that controls the peltier
> junction, or PWM on the peltier, as controlled by some thermocouple,
> thermistor, or digital two-wire temperature sensor. Time Clock, USB I/O,
> maybe ethernet as well, or possibly just two-wired to an optional netmedia
> siteplayer chip package for web functionality.

Yep, abstracted functions that any thermocycler should implement.
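One way to pin down those abstracted functions is an interface
definition. A sketch in Python for readability (the method names are
invented; a real driver would ramp the block rather than jump):

```python
import abc

class Thermocycler(abc.ABC):
    """Hypothetical minimal interface any back end (Peltier + relay,
    PWM bridge, even hand-tended water baths) would implement."""

    @abc.abstractmethod
    def set_block_temp(self, temp_c): ...

    @abc.abstractmethod
    def read_block_temp(self): ...

    @abc.abstractmethod
    def beep(self): ...

class FakeCycler(Thermocycler):
    """In-memory stand-in, handy for testing protocol code off-hardware."""
    def __init__(self):
        self._temp = 25.0
    def set_block_temp(self, temp_c):
        self._temp = float(temp_c)  # instant here; real hardware ramps
    def read_block_temp(self):
        return self._temp
    def beep(self):
        pass
```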

> Work on the basic piece of equipment, get it, then program OCR software and
> crazy algorithms to translate laboratory experiment protocols. All PCR is

Well, I don't think that we should have to do OCR :-) that's the hard
crazy stuff. OCR, human language analysis stuff, that's what we should
be avoiding. Unless we happen to have somebody who is competent in
that area hanging around here and willing to speak up. :-)

> concerned about is temperatures and times, if we have a solid base piece of
> hardware, it should be no problem to easily implement the functionality you
> mentioned.

- Bryan

Bryan Bishop

Apr 16, 2009, 1:10:13 AM
to Jason Morrison, diy...@googlegroups.com, kan...@gmail.com
On Wed, Apr 15, 2009 at 11:44 PM, Jason Morrison
<jason.p....@gmail.com> wrote:
>>> 1. The idea of computer-readable protocols.  The benefits of this are manifold.
>> It's interesting though that this hasn't actually happened yet though.
>> Why is it that we can count the number of examples with our fingers?
>
> Perhaps it's that, at least today, one actually does have to write all
> this XML if we want a computer-readable protocol.  The lack of a
> usable design tool means that you have to care about this in order to
> write protocols in a computer-readable way.  If a piece of software
> existed that (1) made it easy to write protocols and (2) made it easy
> to read/share/follow/measure/track/etc. protocols (i.e. a value add
> above transcribing), and it was made known, I think we might see more
> of these documents pop up.

1) Making it easy to write protocols. So, the one thing that I was
previously working on was a front-end wizard-guide-style program that
would help people as they are writing a protocol. This would be the
hand-holding version. It would work just like the typical
ask-a-question-then-let's-go-to-that-datatype-input software that you
see in introductory programming books (only because I don't have a
better idea). This would also work for asking the programmer whether
or not the information extracted from a parsed file is correct or not-
but I really don't know how to write English language parsers that are
that powerful. :-( I guess I have some reading to do. Any suggestions?
Another way to write protocols would be something like a graphical
point-and-click thing, sure, but really I'm more interested in the
underlying structure, because I'm not too fast with a mouse, whereas
with my keyboard, my fingers fly.
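A bare-bones, non-interactive sketch of that wizard idea (Python; the
questions, keys, and casting are invented, and in real use the answers
would come from input() or a GUI):

```python
QUESTIONS = [
    ("denature_c", "Denaturation temperature (C)?", float),
    ("anneal_c", "Annealing temperature (C)?", float),
    ("cycles", "Number of cycles?", int),
]

def run_wizard(answers):
    """Walk the question list, casting each answer to its datatype,
    and return a protocol dict ready to serialize."""
    return {key: cast(ans)
            for (key, _prompt, cast), ans in zip(QUESTIONS, answers)}

proto = run_wizard(["94", "55", "30"])
assert proto == {"denature_c": 94.0, "anneal_c": 55.0, "cycles": 30}
```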

2) Make it easy to read, share, follow, measure, track protocols. So,
I think reading, sharing and following the protocols is possible. This
is the idea of a web repository frontend, which is this tiny
ikiwiki+git frontend for the YAML database stuff of metadata for
different open source hardware projects, or different protocol
implementations (i.e., whether it's a PCR machine, thermocycler, or
just a symphony of tubes being orchestrated by human manual labor).
But this comes a bit later after some of the basics are set up.
Preserving both the ability to download the information but also share
it would involve something like hashes or something, which is
irrelevant to everybody except someone interested in the backend
'magic' of it all.

> An alternate question is: imagine a universe with no CAD software, and
> asking "why is the only DXF file the one written by the DXF spec
> author?"

It's the chicken-and-egg problem: as soon as you have enough DXF
files, some tool will be written to help manage the files. Of course,
nobody wants to manage the files until that tool is written.

>> When I first entered university,
>> I spent a few hours one day with some sticky notes and a calendar
>> trying to get a course schedule optimized...
>
> Ha!  I got a kick out of that story :)  Luckily, some clever
> programmers at my school built a scheduling tool that scraped course
> day/times from the university course registration server and made this
> a super simple process, so I chose to not have the pleasure of writing
> that myself.

Yep, that's exactly what I did. Web scraping and everything w/ perl's
WWW::Mechanize. Good times. :-/

Nathan McCorkle

Apr 16, 2009, 1:29:11 AM
to diy...@googlegroups.com
> A thousand links of a
> thousand different parts is a step backwards- I want to unify that
> interface so that I don't have to hunt down all of the parts all the
> time.


Thousands of parts??? I think the list for a thermocycler would be less than 50. And why would you always have to hunt down products? If it's designed/built poorly and parts fail, or you're building them to throw away, then I could see why you would "hunt down all the parts all the time"... but really, even when someone would want to sell these assembled, they would still just have to order more components; the number of parts doesn't change the amount of time it takes to order them!

I dunno, maybe I am not understanding something, or maybe I am just more comfortable with a regular straightforward parts list. Buy these parts, solder like such, place in enclosure, connect power supply and data cables.

Bryan Bishop

Apr 16, 2009, 1:33:10 AM
to diy...@googlegroups.com, kan...@gmail.com
On Thu, Apr 16, 2009 at 12:29 AM, Nathan McCorkle <nmz...@gmail.com> wrote:
>> A thousand links of a
>> thousand different parts is a step backwards- I want to unify that
>> interface so that I don't have to hunt down all of the parts all the
>> time.
>>
>
> Thousands of parts??? I think the list for a thermocycler would be less than
> 50. And why would you always have to hunt down products? If it's
> designed/built poorly and parts fail, or you're building them to throw away,
> then I could see why you would "hunt down all the parts all the time"... but
> really, even when someone would want to sell these assembled, they would
> still just have to order more components; the number of parts doesn't
> change the amount of time it takes to order them!

I think you will know the pains I'm talking about when I tell you to
get some item named 'dfuo3ur0892409814' and you don't actually know
what it is or where to acquire one .. these problems intensify as you
get deeper and deeper into obscure machines. Luckily, machines with
something like 15 parts, like a thermocycler, aren't going to be like
that, but in general- this is infrastructure. I think fenn wrote about
this problem the other day pretty well too:

http://groups.google.com/group/openmanufacturing/msg/19e96c5a6c50162e

"""


> That makes sense. Labs therefore need to replicate to meet demand as it is
> made practical.

Labs need to focus on automation so that they aren't subjecting their
members to excessive opportunity costs by making everything from scratch.

If you spend 1000 man-hours building the equivalent of a $1000 machine,
have you gained or lost? What about the second or third time you do it?

> For those on this list knowledgeable of working with metals and
> refrigeration. Is fabbing a Fridge from scratch feasible?

Depends what you mean by "scratch" - I'd define scratch as the most
abundant minerals in earth's crust, water, sunlight, and air. Given that,
your "refrigerator" would probably end up looking something like a fat
tree. Not a bad design, all things considered, but our technology simply
isn't there yet.

Now if you take "scratch" as "anything you can buy at a Home Depot" then
the situation changes, but is that really what you mean?

> What are the steps and how long would it take with optimal tools and
> processes?

Like a refrigerator factory?

The basic components of a conventional refrigerator are:

- refrigerant, a chemical that has a high heat of vaporization and
boiling point near or below the desired temperature; propane or
ammonia are easy to come by. for this you'd need feedstocks,
distillation apparatus, analysis instruments, containment tanks

- compressor motor, usually an electric motor permanently sealed inside
the refrigerant plumbing. requirements: wire rolling and drawing,
brushes or steel laminations, bearings, switches

- compressor pump, requires high precision seals if the motor is not
enclosed in the system. otherwise it's just a relatively complicated
precision mechanical device with multiple bearings and sliding seals..

- heat exchangers, just a long length of copper or aluminum
tubing with fins soldered on. these would have to be made in some
seamless process, probably with a floating mandrel, which is tricky:
http://www.howellmetal.com/HowellMetal/Portals/57ad7180-c5e7-49f5-b28...

- circulation fan. same as for compressor motor, with some bent sheet
metal.

- defrost heater, a quartz tube with some nichrome heating wire inside.
no idea how to make these.

- thermostat, usually a long strip of two dissimilar metals bonded together.
  supposedly they can simply be riveted together, but i always see them
  as a continuous smooth strip

- insulation, often polyurethane foam, but fiberglass and styrofoam are
almost as good. is there an organic chemist in the house?

- shelving, drawers, handles, exterior case, and frame. easy enough if
you know how to make sheet metal.

so based on the level of technology infrastructure required to make each
one of those, i'd say it's not feasible to make from pure chemicals. The
idea is a lot more attractive if you can buy or trade for semi-finished
stock, that is to say, tubing of consistent diameter, sheet metal, wire.
(but if you're doing that, why not just buy a compressor motor? why not
buy or scrounge a fridge?)

you can make a decent produce extender by putting one unglazed ceramic pot
inside another and filling the space between with water.

this isn't even getting into alternative designs like thermoacoustics or
hilsch tubes, which could simplify the technology tree at a cost in
performance and efficiency.

see why we need SKDB?
"""

> I don't know, maybe I am not understanding something, or maybe I am just
> more comfortable with a regular straightforward parts list: buy these parts,
> solder like such, place in enclosure, connect power supply and data cables.

Overall that will be the end result, yes. Think of this as infrastructure.

JonathanCline

Apr 16, 2009, 1:56:52 AM4/16/09
to DIYbio
On Apr 16, 12:10 am, Bryan Bishop <kanz...@gmail.com> wrote:

> but I really don't know how to write English language parsers that are
> that powerful. :-( I guess I have some reading to do. Any suggestions?


man flex
(or for fun, find an implementation of "adventure shell")

It could be that I'm oversimplifying biology protocols. So far I'm
convinced that a lab robot could run straight from protocol-online.org
"directions". The difficult part of bio is "knowing what to try when
it doesn't work" (hard AI), not actually performing the individual
steps (easy AI).

http://labs.fhcrc.org/gottschling/Bacterial%20Protocols/miniprep.html
http://www.genome.ou.edu/protocol_book/protocol_partIII.html#III.C

- there are not that many verbs, and the usage is very precise and
consistent (yay, science).
- there are not that many nouns (any that don't match can be flagged
to be added to the dictionary)
- the measurements are in consistent units
- steps are listed as individual bullet or number points
= the lexical space is relatively small.
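The verb/noun/unit observations above can be sketched as a tiny lexer. This is a minimal illustration in Python rather than flex, and the dictionaries are invented for illustration; in practice they would grow as unknown words get flagged, exactly as suggested above.

```python
import re

# Toy dictionaries; a real system would flag unknowns and grow these.
VERBS = {"add", "mix", "incubate", "centrifuge", "transfer"}
UNITS = {"ul", "ml", "min", "sec"}

def parse_step(step):
    """Split one protocol step into (verb, quantities, unknown words).

    quantities come back as (number, unit) string pairs; anything that
    is not a known verb, unit, or number is returned for dictionary
    review.
    """
    words = re.findall(r"[A-Za-z]+|\d+(?:\.\d+)?", step.lower())
    verb = next((w for w in words if w in VERBS), None)
    quantities = re.findall(r"(\d+(?:\.\d+)?)\s*(ul|ml|min|sec)", step.lower())
    unknown = [w for w in words
               if w not in VERBS and w not in UNITS
               and not w.replace(".", "").isdigit()]
    return verb, quantities, unknown
```

For "Add 500 ul of buffer and mix" this yields verb "add", one quantity ("500", "ul"), and leftovers like "buffer" queued for the noun dictionary.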


Writing lexical analyzers can be really quite fun; you might enjoy it
a lot. Long ago I wrote a couple of talking automatons (s/w robots) that
would chat each other up (as well as any other chat users) in a chat
system, based on an invented dictionary of dialogue with verb-noun
parsing, in a language that a friend of mine invented as a combination
of Perl and C (it was basically C with strings built in), which was
compiled by his compiler written using lex. I guess there aren't too
many people around anymore who know lex or its precursors.

At least it's worth looking into the complexity of this as a solution
before trying to shoehorn bio into any-ML at the user-entry level.
*ML is a computer-to-computer interchange format and should ideally
never touch the screen or come from a user's keyboard (the exception
being the developers themselves). In particular, *ML should never reach
the biologists... they're timid & scare easily. ;-D


Getting a bit off topic on this thread.

JonathanCline

Apr 16, 2009, 2:28:54 AM4/16/09
to DIYbio
On Apr 15, 7:16 pm, Jake <jakes...@mail.com> wrote:
>
> I think the USB idea is probably too complicated, or at least outside
> what I've worked with.  But you could have a serial connection plugged
> into a cheap serial->USB adapter.


I agree. USB is a very complicated phy and protocol (multiple
profiles to support many device types), and even big companies have
difficulty supporting specialized USB products, mostly as a result of
USB driver issues on Microsoft operating systems. The saving grace is
that several different microcontrollers these days not only have a
built-in USB phy, they also have built-in USB-to-serial firmware which
uses the standard USB protocol framework for serial devices (shipped
with every o/s), and the firmware is made freely available for those
microcontrollers. So there is no USB driver to write on the o/s side
and no microcontroller USB firmware to write on the circuit side
either. The device acts like a serial port from the computer side.
From the microcontroller side it acts like a character buffer.

I'm using this one: http://www.schmalzhaus.com/UBW/index.html

Microchip PIC with built-in USB physical layer port, and USB-to-serial
firmware written by Microchip. Board layout done by a DIY guy. When
it's plugged into a computer, it looks like a serial port with an
infinitely large buffer and autobaud rate.

In the default software, sending simple serial commands will act on
the GPIOs. There are also commands for controlling the other
peripherals (PWM, I2C, A/D, etc). In Java, the microcontroller is
accessible with the standard serial framework. The toughest part was
string-to-character and character-to-string conversion, because Java is
too rigid to allow native 8-bit character conversion without a minor
hassle.

Because of this, low-bandwidth (<500 kbit) USB is now easy; the work is
already done.
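Since the board ends up looking like a serial port with an ASCII command set, the host side can be as simple as framing comma-separated commands. A minimal sketch of that framing (the command names here are illustrative, not the actual UBW command set):

```python
def make_command(name, *args):
    """Frame a command as comma-separated ASCII ending in a carriage
    return, the style many serial command sets use. Command names are
    illustrative, not a real firmware's set."""
    return ",".join([name] + [str(a) for a in args]) + "\r"

def parse_reply(line):
    """Split a reply line back into its comma-separated fields."""
    return line.strip("\r\n").split(",")
```

For example, make_command("ST", 94) would produce the bytes "ST,94\r" to write to whatever serial port object the OS exposes; a hypothetical firmware could read that as "set target 94 C".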

Nathan McCorkle

Apr 16, 2009, 11:52:11 AM4/16/09
to diy...@googlegroups.com
It seems like you start by agreeing that it's too tough, but then explain that the USB-to-serial driver ships standard with almost every OS, and that even Java can access it natively. That sounds easy and desirable.

Bryan Bishop

Apr 16, 2009, 12:04:25 PM4/16/09
to diy...@googlegroups.com, kan...@gmail.com
On Thu, Apr 16, 2009 at 12:56 AM, JonathanCline <jnc...@gmail.com> wrote:
> On Apr 16, 12:10 am, Bryan Bishop <kanz...@gmail.com> wrote:
>> but I really don't know how to write English language parsers that are
>> that powerful. :-( I guess I have some reading to do. Any suggestions?
>
> man flex

I'm vaguely familiar with flex, bison and yacc. So, BNF grammars and
such. Okay. When I look at these grammars, they are mostly used to
define the syntax and grammar of programming languages, which is then
used as input to a compiler-compiler. Things tend to crash when they
hit syntax errors (which this poorly formatted, unstandardized
text is full of) and other problems. It would be absolutely awesome
if we were somehow able to convert all of the information into a
computer-readable format via a flex-based grammar, with flagging of
which files were incorrectly interpreted or parsed, plus some
non-negligible number of protocols correctly converted into
(something). Do you have any references on either software that does
this, or something from the literature demonstrating this on large,
poorly structured datasets? Worse than fuzzy.

> (or for fun, find an implementation of "adventure shell")

You mean like the old infocom adventure game stuff?

> It could be I'm simplifying biology protocols.   So far I'm convinced
> that a lab robot could run straight from protocol-online.org

I have seen so many, many problems with the protocols there. I'll make
a list of just some stuff I've noticed, stuff that would cause a
terrible headache.

* out of order sections
* different formatting for the representation of different sections
* grammatical references to previous steps
* references to previous steps but using different terminology not
explicitly defined anywhere else in the same document (i.e., "after
the digestion step" but it didn't tell you that four steps ago you
were using a restriction enzyme to digest a DNA strand)
* ambiguity on what constitutes a 'step' in the process versus when
it's a 'note' or just an added piece of information.
* "magic human touch" protocols- we've all seen these- where there's
something "magical" that some poor graduate student had to master by
hand, or something (so this wouldn't be a good protocol to waste your
time on anyway)
* special information in diagrams that can't easily be extracted into
the "flow of text" for proper parsing
* mangled information- sometimes the weird representation of the
protocol that was chosen was because the traditional format doesn't
apply for some reason (structure is function)
* etc.

Now, all of those (or almost all of those) are problems that can be
solved by tiny hacks to a parser. But I really don't see the benefit-
especially given the wide variation in how protocols are represented
across protocol-online.org. In some cases there are subsets of the
dataset that are formatted in the same way, and I think it would be
useful to be able to convert those subsets (but I haven't found them
yet) that follow the same "grammar" (even though we don't know what
grammar that actually is (hm, a tough one)). Those subsets would make
good starting spots for converting protocols or running the parser on.
Could you help convince me that an amazing super-grammar can be written
with flex to solve this problem?

> "directions".  The difficult part of bio is "knowing what to try when
> it doesn't work" (hard AI), not actually performing the individual
> steps (easy AI).

Right, if you get to assume hard AI then yes, everything becomes
magically easier- of course. I spent a few hours yesterday talking
with somebody who wants to do AGI research- so my thoughts on
this are kind of fuzzy, and I have no idea if you're talking about
AI in the sense that AGI people talk about AI, or the Peter Norvig
sense of fancy tricks with SVMs and machine "learning" algorithms
(which aren't actually about the biological analogs of learning (which
isn't a bad thing, it's just hard to distinguish them for an untrained
eye)).

> http://labs.fhcrc.org/gottschling/Bacterial%20Protocols/miniprep.html
> http://www.genome.ou.edu/protocol_book/protocol_partIII.html#III.C
>
> - there are not that many verbs, and the usage is very precise and
> consistent (yay, science).

Not really. Sometimes you even see domain-specific terminology or verbiage.

> - there are not that many nouns (any that don't match can be flagged
> to be added to the dictionary)

Okay.

> - the measurements are in consistent units

Sometimes. My favorite example is "2 volumes EtOh"- I think I saw this
on openwetware.

> - steps are listed as individual bullet or number points

Not always. Sometimesyougetblocksofparagraphsandmangledtextlikethisandyouneedtobeanexpertatenglish.

> = the lexical space is relatively small.

It seems to vary widely to me. Can you show me a subset that has a
small lexical space?

> Writing lexical analyzers can be really quite fun, you might enjoy it
> a lot.  Long ago I wrote a couple talking automatons (s/w robots) that
> would chat each other up (as well as any other chat users) in a chat
> system based on an invented dictionary of dialogue with verb-noun
> parsing, in a language that a friend of mine invented as a combination
> of perl and C (it was basically C with strings built in), which was
> compiled in his compiler written using lex.   I guess there's not too
> many people around anymore who know lex or it's precursors.

Back in the day I was doing something like that, except I was using
chatbots that were built around regular expression matching and
regular expression manipulation. I thought I was the king after that
:-). They weren't very bright, IMHO. ;-)

> At least it's worth looking into the complexity of this as a solution
> before trying to shoehorn bio into any-ML at the user-entry level.
> *ML is a computer-to-computer interchange format and should ideally
> never touch the screen or come from a user's keyboard (the exception

That's fine, we can keep it hidden behind the scenes- either through
wizards or graphical programming interfaces (a recommendation from
this thread) or other possibilities. Another possible wizard would be
one that helps interpret a particular file: it would ask which parts
are related to which, and thus how to generate the grammar to feed
into flex- i.e., which characteristics mark a new step, which actions
reference other procedures, and whether those procedures match any
particular open source hardware package (e.g., a step that says "do
PCR here using thermocycler" (I know, this is a bad way of saying it),
followed by an option to look through the local inventory for
compatible machines). Maybe I'd believe this could work- now, how
would you go about asking the user how to parse the text they pasted?
There would be a few common blocks that we're expecting, and each
block has some delineation method. If you've ever imported a dataset
into OpenOffice Calc or any other spreadsheet application, there tends
to be a mechanism by which the program figures out the delimiter
between data points in a data set, which is pretty nifty- something
like that could be used here, then applied to a regular expression (or
a lexical parser generator), especially with regexp anchors like $ and
^ and so on.
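The spreadsheet-import trick described above (guessing the delimiter before asking the user to confirm) is already available in Python's standard library; a minimal sketch:

```python
import csv
import io

def guess_delimiter(text):
    """Guess the field delimiter of a pasted block, the way a
    spreadsheet import wizard does before showing its preview."""
    return csv.Sniffer().sniff(text).delimiter

def split_rows(text):
    """Split pasted text into rows of fields using the guessed delimiter."""
    return list(csv.reader(io.StringIO(text),
                           delimiter=guess_delimiter(text)))
```

A wizard built on this would show the split rows and let the user correct the guess, rather than trusting the sniffer blindly.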

> being the developers themselves).  Especially ML should never reach
> the biologists.. they're timid & scare easily.  ;-D

did you just insult biologists??

Jake

Apr 17, 2009, 4:13:21 AM4/17/09
to DIYbio
I think you're right, Nathan. We can't get bogged down in computer
mumbo-jumbo. 90% of this thread is already unproductive discussion
about computer languages and file formats. That has nothing to do
with actually designing and building a thermocycler.

I think it's great what Bryan is trying to do, and I'm sure that is
the future direction of science. But who is going to actually order
the parts? Who is going to actually solder them onto a breadboard?
Who is going to program the software? Who is going to input the
program? Who is going to decide what DNA to PCR up? Who is going to
load the tubes? Who is going to start the cycle? Who is going to
remove the tubes? Who is going to clean up the rxn mix?

The answer to all of those questions is: Me! A computer isn't going
to do any of that for me. Now it sure would be nice if all we had to
do was come up with an idea like "Make a glow-in-the-dark worm so that
fish will bite on it better," and encode it into an XML file and a
computer would figure out all the lab procedures, order the cDNA and
reagents, design a thermocycler to the required specs, order the parts
for it, walk you through the construction, program it, help you load
it, and so forth. But the reality is that it's not going to do any of
that.

So I don't see the actual value of any of this talk about putting
things into certain machine-readable formats. Maybe I'm missing
something, if so please try to educate me. What I'd like to know is
what this computerization of everything is actually going to do to
help the project?

It would be great if someone would take the work and encode it in a
useful format. But how is that actually going to help get this thing
designed and built? The only way I know how to do things is to look
at what others have done, borrow and modify their designs,
collaborate with the electronics people and programmers, come up with
a prototype design, and build it. Plenty of software tools may help
along the way, but nothing I know of is going to do ANY of the actual
work!


Someone mentioned a target cost... I think it should just be as cheap
as possible. We already know the basics of what it has to do. And
there are quite a few ideas for features that will hardly add
anything to the cost.

When it comes to USB vs serial vs ethernet vs wireless, we need to keep
the point of the project in mind and use the cheapest possible way.
I've built a serial-interfaced microcontroller board before, so I know
that's one way that will work just fine. When the group looked at USB
we decided it was too much extra money and design time to justify.
That was quite a while ago, so maybe USB is cheaper nowadays.
One thing we considered was that we'd have to do extra programming to
use the USB interface, rather than just debugging in HyperTerminal and
implementing a simple serial data transfer protocol for the data. If
we can find a cheap microcontroller with onboard USB, that would be
great. We have to factor the building time into the cost as well. If
it costs an extra $5 to get everything on one chip vs a bunch of
cheaper parts that have to be soldered in individually, then it will be
worth it. I've seen USB devices with memory, microcontroller, and all
kinds of other cool stuff made on a PCB the size of a postage stamp.
That has to be cheaper than a bag full of parts soldered onto a
postcard-sized PCB.

A couple of posts were about integrating a webserver. I don't see how
that would be of any use in a thermocycler. You put in your tubes and
start the program. When it's done you take out your tubes and proceed
to your next step. It's not something you need to control over the
web. There's really nothing you can do with it on the web: if there
are no tubes in it, there's no point in starting it, and when you put
your tubes in you start it right away. I can't think of a situation
where you'd load it up with the intention of starting it from a
different location at a later time. If anyone needs something like
that, I'm sure a simple software timer would work just fine.

As was mentioned, there are also security concerns. And it could be
just as bad as stopping a pacemaker. Suppose someone decides to log
into my thermocycler at 3am and turn it up to full heat? They do get
pretty hot (at least boiling temperatures). There's every chance it
could start a fire; I could easily see it melting down and catching
fire if it were run long enough. At the very least it probably
wouldn't be hard to ruin the thermocycler.

So if it had some sort of network connection, we'd have to worry about
all sorts of security measures and failsafes. That goes against the
idea of cheap. As for the slickness of a web interface... that would
be nice, but I'm sure someone will make a nice Java or VB program to
operate it if we come up with the hardware.

I think the first thing to figure out is the computer -> cycler
interface (serial, USB, etc.) along with the microcontroller and PCB
(motherboard). The lights, switches, fans, peltier junction(s), etc.
seem pretty simple and straightforward to me. I'm plenty happy to
tear apart my thermocycler and figure out what it's doing and how it's
doing it. I would love to smash its shitty, archaic interface into a
million bits with a hammer and hook it up to an open thermocycler
mainboard instead.
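Whatever the interface ends up being, the program the mainboard runs is just a list of temperature setpoints. The cycling protocol at the top of this thread expands into a flat schedule a controller could step through; a minimal sketch (defaults taken from that protocol, parameter names my own):

```python
def pcr_program(denature=94, anneal=55, extend=72, cycles=30,
                step_s=30, init_s=180, final_s=300):
    """Expand a PCR protocol into a flat list of (temp_C, seconds)
    setpoints for a controller loop to execute one by one. Defaults
    follow the protocol quoted at the start of this thread."""
    steps = [(denature, init_s)]        # initial denaturation, 3 min
    for _ in range(cycles):             # denature / anneal / elongate
        steps += [(denature, step_s), (anneal, step_s), (extend, step_s)]
    steps.append((extend, final_s))     # final elongation, 5 min
    return steps
```

A host program would download this schedule over the serial link (or send it one setpoint at a time), and the firmware's only job would be holding each temperature for its duration.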

Maybe someone with modern uC experience can chime in with some help?


-Jake


Josh Perfetto

Apr 17, 2009, 7:39:39 AM4/17/09
to DIYBio Mailing List, Jake
Hi Jake,

I agree with you that some of this talk has gotten too far ahead of itself,
and the immediate need is to construct the hardware. I disagree about some
things though.

From an engineering perspective, I think a computer interface simplifies the
design. There is no need to engineer some sort of on-device programmable
interface with a small LCD and what will inevitably be a very awkward,
difficult interface. It would be much easier to create a more usable
interface in software, you will save on hardware cost and design time, and
this interface will later enable the more advanced applications that are
being discussed.

Also, I don't think the goal should be to construct a device for as cheap as
possible. Anyone looking to do PCR as cheap as possible can already do so
with 3 water baths. I'd suggest an alternate goal of building something
which a maximal amount of people will use, which is a trade-off of not only
cost, but also usability. The primary users of this open thermocycler
machine will not be EE enthusiasts but bio enthusiasts.

With regard to interfaces, ethernet suits my needs best, but I realize it is
probably not the optimal trade-off for this decision. However I think
serial is also very sub-optimal, especially considering how many modern
computers do not even come with this interface anymore. As others pointed
out, USB boards are now ubiquitous and relatively inexpensive, and people
like me who desire ethernet (mainly so it wouldn't be necessary to have a
dedicated computer attached) could always use some ethernet USB server (I
realize there are ethernet-enabled serial "terminal servers" but those are
prehistoric :) ).

-Josh

Bryan Bishop

Apr 17, 2009, 11:20:17 AM4/17/09
to diy...@googlegroups.com, kan...@gmail.com
On Fri, Apr 17, 2009 at 3:13 AM, Jake <jake...@mail.com> wrote:
> I think you're right Nathan.  We can't get bogged down in computer
> mumbo-jumbo.  90% of this thread is already unproductive discussion
> about computer languages and file formats.  That has nothing to do
> with actually designing and building a thermocycler.

I don't know how you're measuring whether or not something has to do
with designing a thermocycler, but CAD is still an
important aspect of engineering. It allows the machine to be more
repeatable and more describable. It means it's not just some hack;
instead it's something with a specific set of specifications,
transmitted for engineering- for building- for making stuff work.

> I think it's great what Bryan is trying to do, and I'm sure that is
> the future direction of science.  But who is going to actually order
> the parts?  Who is going to actually solder them onto a breadboard?
> Who is going to program the software?  Who is going to input the
> program?  Who is going to decide what DNA to PCR up?  Who is going to
> load the tubes?  Who is going to start the cycle?  Who is going to
> remove the tubes?  Who is going to clean up the rxn mix?
>
> The answer to all of those questions is: Me!  A computer isn't going
> to do any of that for me.  Now it sure would be nice if all we had to

How do you know that a computer isn't going to do any of that? Are you
a programmer? Have you seen websites that already order parts? Have
you seen videos that teach you how to solder? Have you seen machines
that preconfigure other machines for lab experiments? It just sounds
like you've not considered any of this.

> do was come up with an idea like "Make a glow-in-the-dark worm so that
> fish will bite on it better," and encode it into an XML file and a

Do fish really bite better on glow-in-the-dark-worms? Since, you know,
they are glowing, and worms don't normally /do/ that.

> computer would figure out all the lab procedures, order the cDNA and

By the way- your computer already orders cDNA and reagents, you're
just the one clicking the browser buttons. Just imagine doing that
without clicking- it's a very simple computer trick.

> reagents, design a thermocycler to the required specs, order the parts

Ever order anything over the web from Amazon? etc.

> for it, walk you through the construction, program it, help you load

Ever read instructions for a Lego kit?

> it, and so forth.  But the reality is that it's not going to do any of
> that.

How do you know?

> So I don't see the actual value of any of this talk about putting
> things into certain machine-readable formats.  Maybe I'm missing
> something, if so please try to educate me.  What I'd like to know is
> what this computerization of everything is actually going to do to
> help the project?

In truth, I'm not really asking anybody to do anything new- I'm the
one who has been putting the work into this stuff that you think is
impossible (I guess it's all the more awesome if I make something work
that you otherwise consider impossible). But what I do want to see
happen is CAD files of the designs, and so on, and Jonathan Cline's
grammar parser if he's working on that, because that's really really
exciting and almost worth its weight in gold, or however you say that.

> It would be great if someone would take the work and encode it in a
> useful format.  But how is it actually going to help get this thing
> designed and built?  The only way I know how to do things is to look

CAD files can be used to help build things by telling other machines
where to cut or how to remove metal. And in the case of additive
manufacturing, there are other ways to build plastic components (goo
squirters, anyone?). And of course ultimately there's the classic go
over to the bench and saw some wood or something, yeah.

> at what other's have done, borrow and modify their designs,

How do you modify their designs if they are just JPEGs? At that point
you're reconstructing a lot of previous effort, and you're wasting
time.

> collaborate with the electronics people and programmers, come up with
> a prototype design, and build it.  Plenty of software tools may help
> along the way, but nothing I know of is going to do ANY of the actual
> work!

That's why I'm building it. We're making it happen. Thank goodness we
live in the future.

> Someone mentioned a target cost...  I think it should just be as cheap
> as possible.  We allready know the basics of what it has to do.  And
> there are quite a few ideas for features that won't add hardly
> anything to the cost.

When other projects have some expensive component, I most often see
some alternative component that could be switched in if you want to be
ridiculously cheap about the whole thing. So something like that might
be appropriate here, especially if USB/ethernet/SD turns out to be the
price inflation culprit.

> When it comes to USB vs serial vs ethernet vs wireless we need to keep
> the point of the project in mind and use the cheapest possible way.
> I've built a serial interfaced microcontroller board before, so I know
> that's one way that will work just fine.  When the group looked at USB
> we decided it was too much more money and too much design time to
> justify.  That was quite awhile ago, so maybe USB is cheaper nowdays.
> One thing we looked at was that we'd have to do extra programming to
> use the USB interface rather than just debugging in hyperterminal and

I'm still confused about this. Do you guys really use hyperterminal?
What operating system are you using??

> A couple posts were about integrating a webserver.  I don't see how
> that would be of any use in a thermocycler.  You put in your tubes and
> start the program.  When it's done you take out your tubes and procede
> to your next step.  It's not something you need to control over the
> web.  There's really nothing you can do with it on the web, if there's
> no tubes in it there is no point to start it.  And when you put your
> tubes in you start it right away.  I can't think of a situation where
> you'd load it up with the intention of starting it from a different
> location at a later time.  If anyone needs something like that I'm
> sure a simple software timer would work just fine.

Streaming data over the web, repairing the software over the web,
diagnostics- what if I have to fix diybio NYC's thermocycler and I am
all the way over here in Austin? There are other advantages, have you
ever used a print server?

> As was mentioned there's also security concerns.  And it could be just
> as bad as stopping a pacemaker.  Suppose someone decides to log into
> my thermocycler at 3am and turn it up to full heat?  They do get
> pretty hot (at least boiling temps). There's every chance that it
> could start a fire.  I could easily see it melting down and catching
> fire if it were run long enough.  It probably wouldn't be hard to at
> least ruin the thermocycler.

There are security measures that we could implement to prevent that.
And I agree that an external interface- to the outside world- is
probably a bad idea. However, having a presence on the internal
network is still fair game, IMHO.

> So if it had some sort of network connection we'd have to worry about
> all sorts of security measures and failsafes.  That goes against the
> idea of cheap.  As for the slickness of a web interface... that would

Nah, security software is free.

> be nice, but I'm sure someone will make a nice java or vb program to
> operate it if we come up with the hardware.

VB? What operating system are you using??

Tito Jankowski

Apr 17, 2009, 11:36:09 AM4/17/09
to diy...@googlegroups.com
Great, glad to see so many different areas of interest covered.

At this point, an important step is to write down requirements -- what the user wants to do. Jake, Josh, Nathan -- as biologists, you are the user, and you are big contributors to this phase.

We need more about the "what" of the Open Thermal Cycler -- it looks like we have a lot of options for "how".

Jake, Josh, Nathan -- please talk more about that. Could you walk us through one of your experiments that involved a thermal cycler? Make the assumption that we know nothing about biology.

Tito

PS

I'm speaking at CodeCon in San Francisco this Saturday about open tools for biotech -- including the open gel box and the beginnings of this project.

JonathanCline

Apr 17, 2009, 12:06:43 PM4/17/09
to DIYbio
On Apr 16, 10:52 am, Nathan McCorkle <nmz...@gmail.com> wrote:
> It seems like you agree that it's too tough, and then say how the USB to
> serial driver is shipped with most every OS standard... and list that even
> java can access it natively, that sounds easy and desirable.


That's right. Like I said, "The saving grace is..."

There are open designs which eliminate most of the work of using USB
by providing USB-to-serial functionality within the microcontroller
itself (for low bit rates), and these designs come pre-assembled with
data & power routed to the microcontroller. So the work is already
done. Whatever anyone's building should probably use one of these open
designs.

Adding USB to some *arbitrary* design (like some favorite motorola
chip the hacker has hacked before) which would require using a
separate USB chip (phy) and writing a custom USB driver, etc (i.e. the
method the original poster was referring to), is beyond most amateur
DIY. It is built-from-scratch. It is difficult and time consuming.
Which is why he suggested using a USB-to-serial external 'module'
instead of dealing with USB chips. But none of that is necessary if
using one of the open designs (like UBW or arduino or presumably
others) as I described, and using them is rather simple. That is my
point.


## Jonathan Cline
## jcl...@ieee.org
## Mobile: +1-805-617-0223
########################


JonathanCline

unread,
Apr 17, 2009, 12:10:51 PM4/17/09
to DIYbio
On Apr 17, 3:13 am, Jake <jakes...@mail.com> wrote:
> I think the first thing to figure out is the computer -> cycler
> interface (serial, USB, etc.) along with the microcontroller and PCB
> (motherboard).

Go here, please: http://88proof.com/synthetic_biology/blog/archives/tag/microcontroller

Jake

unread,
Apr 17, 2009, 5:05:06 PM4/17/09
to DIYbio
Jonathan, thanks for the update to the USB issue. Since it's
basically going to be serial data anyways maybe it's easiest to just
do it that way and use a $5 serial -> USB cable. I saw on one arduino
site that they decided to put the communications chip in the cable to
save costs. They have a TTL -> USB cable. Is this the same as a
serial -> USB cable or does it save even more parts?

See: http://www.littlebirdelectronics.com/products/bare-bones-arduino-kit

They call it a "FTDI TTL-232R USB-to-TTL serial cable".
http://www.littlebirdelectronics.com/products/usb-ttl-serial-cable

Thanks for also reminding me about drivers. With my microcontroller
project we had a serial interface and I was going to try writing a
driver and some simple software for doing certain functions. I never
was able to figure out the driver part though.

One thing I was also thinking of is using LabView as the interface
software. Labview makes it pretty easy to do all sorts of stuff and
display the output. It's drag-and-dropish so it might be really handy
for playing around with the interface. It would also give the
functionality of being able to be part of a larger process. Someone
with Labview at their university could design the interface and then
use the application builder to make a standalone app that would serve
as the thermocycler interface. That way we could make a lot of
different and very slick interfaces for using it.


Josh said:
"From an engineering perspective, I think a computer interface
simplifies the
design. There is no need to engineer some sort of on-device
programmable
interface with a small LCD and what will inevitably be a very
awkward,
difficult interface. It would be much easier to create a more usable
interface in software, you will save on hardware cost and design time,
and
this interface will later enable the more advanced applications that
are
being discussed."

Well put. My thoughts exactly.

> Also, I don't think the goal should be to construct a device
> for as cheap as possible.

I do think we should focus on this. If we make a really cheap device
it will be available to more people. We also might attract people
that otherwise would just buy a commercial unit. If our device is
half the price of a commercial unit some small labs might just decide
to go with the commercial unit. But if ours is 1/10th the price of a
crummy commercial unit and it has a lot more features I think a lot of
people would almost have to at least give the open thermocycler a try.

I really don't see what features would really cost that much. With
the uC's that have been mentioned I don't see anything that couldn't
be done. All I'm suggesting is that the base design should be as
cheap as possible. The biggest cost factor that is adjustable seems
to me to be the block and peltier junction. And each user could
decide what they need for that. If you used a small block with small
peltier junctions you would save quite a bit of money and your cycler
would be more responsive. I actually see this as a feature since it's
a small thermocycler, doesn't take up much bench space, and is highly
responsive with good temperature control.

Now if you use a bigger block it will cost you more and you'll need
bigger peltier junctions, bigger fans, bigger heatsinks, a bigger
power supply, the unit will be bigger, etc.. Most of the time I think
people are only going to need to run a couple tubes at a time. Unless
you're doing diagnostic screening you probably don't need a lot of
tubes per run. Even then you could do some sample pooling for batch
screening and reduce that requirement.

I'd like to see a small unit right next to my computer. Unless we
have some sharp engineers here that are good at thermal modeling we
might well underdesign our hardware. If the block is too big we might
find out that the device can't cycle temps fast enough, or that we
need more power than we can switch, or more p-junctions, etc.. Or
simply that the cost gets out of hand. At least if we start with the
"as cheap as possible goal" we'll have a point where we can say "this
device is as cheap as can reasonably be made". Then we can expand
from there using the cheap platform as a starting point. Otherwise
someone down the road might decide they need it cheaper and have to
start from scratch again.

> Anyone looking to do PCR as cheap as possible can already do so with 3 water baths.

I'm hoping that we can make this device cheaper (and with better temp
control) than the cost of 3 water baths. Now I know you can just use
a burner and a pot of water, but I don't think you can really get the
level of temp control required to really do a good job, and
thermostatically controlled waterbaths aren't cheap.

> I'd suggest an alternate goal of building something which a maximal amount
> of people will use, which is a trade-off of not only cost, but also usability.

I'm not sure what features would cost all that much. I'm not
suggesting hacking it together with duct tape and chewing gum. "As
cheap as possible" to me means as cheap as you can do it while making
it at least as good as a commercial unit.

So we need to find out if people really need a lot of tube space. How
many do people really want to run at once? Since you usually adjust
your program based on what you're doing I think that most of the time
you have only a couple tubes per run.

I think we need to figure out how many tubes people need to run. We
should also check out what size p-junctions are cheap and available
off the shelf. Then that will give us our optimal block size for
cost, then hopefully we can reconcile the two numbers.

American science and surplus has peltier junctions 1-9/16" sq. x 3/16"
thick for $20.95 each. That size isn't going to run a very large
block.

One thing I thought of is just make the cheapest unit modular! So say
you have a 4-6 tube block. Now instead of spending more money on
larger p-junctions, etc. just put two 4-6 tube blocks in your unit.
Then we'd have a cheap and modular design. IMO if you had a unit that
could run 4 blocks X 4 tubes each, and run a different cycle on each
one... WOW! That would be a real time saver. It would be WAY more
useful than one big thermocycler block like most units you see.

> I think serial is also very sub-optimal, especially considering how
> many modern computers do not even come with this interface anymore.

That's true, serial is a "legacy" interface that isn't going to be on
modern computers anymore. However you can get a cheap serial -> USB
cable to solve this. You can also get a serial port card, or you
could use an older computer. We really don't need much processing
power for this application.

I don't know enough about the current state of USB to really say much
about it. I do know that USB drivers are a bitch to write. I do know
that serial is plenty fast for our needs, and we can start using
hyperterminal with it right away rather than have to wait for someone
to write a driver before we can even start playing with it.

Tito said:
> Great, glad to see so different areas of interest covered.

Me too! I'm glad that so many skilled people seem interested in the
project.

> Could you walk us through one of your experiments that involved a thermal cycler?

There are a lot of different uses for PCR. They all have different
aspects, so we really need to concentrate on making sure that our
device has all the features needed for all of them.

They all come down to amplifying DNA. It's kind of hard to go into
all the specific uses. It really just needs to cycle temperature for
you with tight temp control and fast cycle times. In a genetic
engineering example you would first identify your gene of interest and
look up its sequence. Then you would take a sample of the tissue,
and extract the genomic DNA. You would design primers from the
sequence data. Then you would look at the GC content of your primers
and template DNA. Using that data (see earlier post for a general
formula used to estimate melting temp) you would program the cycle.
You then add Taq (or another) polymerase along with a buffer and other
components to your genomic DNA and run the cycle. Nowadays they just
call most of that the "PCR master mix" so you don't really have to
worry much about that. After the cycle is done you should have a
bunch of your DNA of interest with some genomic DNA left over. You'd
usually run it on a gel and cut out the bit that is the size that you
want. Then you can run it again with that more purified DNA or it
might be good enough already. You would use that DNA and ligate it
into a vector. At that point you can use your host to amplify the
vector as much as you want.
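The melting-temperature estimate from the earlier post (Tm = 4(G+C) + 2(A+T), with Ta taken 5 C below the lower of the two primer Tms) is easy to sketch in a few lines. This is the simple Wallace rule, only reasonable for short primers; the function names are just illustrative:

```python
def wallace_tm(primer: str) -> int:
    """Estimate primer melting temperature (deg C) with the Wallace rule:
    Tm = 4*(G+C) + 2*(A+T). Only sensible for short (~14-20 nt) primers."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    at = p.count("A") + p.count("T")
    return 4 * gc + 2 * at

def annealing_temp(forward: str, reverse: str) -> int:
    """Ta = 5 C below the lower of the two primer Tm values."""
    return min(wallace_tm(forward), wallace_tm(reverse)) - 5

# Example: a 16-nt primer with 8 G/C and 8 A/T bases gives Tm = 48 C.
```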

That would be one standard use. You can also use it for diagnostic
screening. In that case you would be trying to amplify DNA to check
if it's there. Maybe you are looking for HIV in a blood sample.

There's also reverse transcription for preparing cDNAs, tailing for
adding sequences on the ends to make vectors, etc.

I think gradient PCR is too hard to implement in a cheap machine.
You'd have to have more p-junctions and more sophisticated temp
control I would think. However, if you had a modular machine with
multiple blocks you could do pretty much the same thing.
Well I think it's up to the people who will actually be programming
this thing to decide the best way to do it. I played around quite a
bit with a serial system designed by someone else. I can do a bit of
programming of the uC to make it send the data in the format I want,
but otherwise I'm pretty clueless as far as the electronics and driver
parts.

I'm pretty handy at building and testing electronics if someone else
designs them. I'll also be happy to muck about with uC code and get
the data I want sent to a computer. I have a thermocycler which I'll
be happy to tear apart and see what it's doing.

I kind of like the idea of using arduino parts. I think we should
look at and choose a platform based on that. That way we can use a
prefab motherboard and get it working. Later we can look at a custom
PCB and moving parts onto that PCB. The arduino guys have already
done a lot of work that we can use. The one that I was looking at
was:
http://www.littlebirdelectronics.com/products/bare-bones-arduino-kit

It seems reasonably priced and uses all through-hole components. That
should be a good platform to start from. We might then need to put
together a daughter board for some of the external components (relays,
temp sensor parts, etc.), or we can kind of just cobble them together
for prototyping (or use a protoboard hooked up to the uC board). Then
once everything's worked out we can get a custom PCB manufactured that
would hold the uC, and all the other needed parts.

Am I on the right track here or is there a better way?


-Jake

Jake

unread,
Apr 17, 2009, 6:02:49 PM4/17/09
to DIYbio
> I'm still confused about this. Do you guys really use hyperterminal?
> What operating system are you using??

I brought that up because that's what I'm currently using with my
spec. I'm using WinXP with hyperterminal to get data from the spec.
It has a serial interface and I use a serial -> usb cable to get the
data. What I do is open hyperterminal, do my scan, then it asks to
send the data, I go in hyperterminal to capture (I haven't done it for
a while, so I'd have to do it again to tell the exact details), then I send
the data. On the screen up comes some date/time/user info then
[wavelength] [TAB] [absorbance] [CR]. It fills a screenful or so
based on what wavelengths I scanned. Then I go to save file. Then I
have this text file. Now I open Excel and import the data to a spread
sheet. I have to check some boxes and other bullshit to get it
imported in the right format each time, which is a hassle. Once it's
in I have two columns of numbers, one for wavelength and one for
absorbance. Then I put one on the x-axis and one on the y-axis and I
get a graph of absorbance vs. wavelength.

Once I've punched all this in I just import the next scan into a new
file, then cut and paste from one to the other and it changes the
graph data. This works pretty well because Excel can do some pretty
fancy graphing. I can overlay two scans on top of each other, maybe
subtract one from the other to see the difference, and/or I can adjust
the lines so they match up by programming the cells to subtract 10
from each reading of one of the scans.

This works just fine, but it's a hassle. Getting the data is the pain
because I have to use hyperterminal, save the data to a text file,
then import it (using the right options), then cut and paste it from
that spreadsheet into the one that is already set up with the graphs,
etc.. It would be really nice if there was a way to automate this so
that the serial port is read directly into an Excel file.
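For what it's worth, the capture-to-spreadsheet step could be scripted. Here's a minimal sketch (filenames hypothetical) that pulls the [wavelength] [TAB] [absorbance] lines out of a hyperterminal capture file, skips the date/time/user banner, and writes a two-column CSV that Excel opens directly:

```python
import csv

def capture_to_csv(capture_path: str, csv_path: str) -> int:
    """Extract tab-separated wavelength/absorbance pairs from a
    hyperterminal capture file, skipping banner and blank lines,
    and write them as a two-column CSV. Returns the row count."""
    rows = []
    with open(capture_path) as f:
        for line in f:
            parts = line.strip().split("\t")
            if len(parts) != 2:
                continue  # banner line, blank line, etc.
            try:
                rows.append((float(parts[0]), float(parts[1])))
            except ValueError:
                continue  # non-numeric junk
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["wavelength_nm", "absorbance"])
        writer.writerows(rows)
    return len(rows)
```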


My point in bringing this up is that a lot of the time the major
companies aren't doing anything fancy (I have a Bio-Rad SmartSpec). If all
the thermocycler uC does is read out a list of numbers into
hyperterminal then we can put that into Excel and get a graph of time
vs. temp to verify what is happening. Or whatever else we need.

I'm just putting what I already know on the table for comments. I
know how to solder together a uC board, solder on a serial cable,
program the uC to get the data I want, read the serial data in
hyperterminal, import that into Excel, and produce a nice looking
graph of the data.

I'm sure that someone knows how to do this better than me. However I
think it can be done how I mentioned and that might be a good starting
point to build upon.

My thermocycler interface is a POS! I'll take a clumsy hyperterminal/
serial interface over it any day of the week. And if it works then
who cares if it's a little clumsy at first. It'll be cheap and
workable. If I can save $900 on a thermocycler by using
simple off-the-shelf components and interfaces I'll happily do it.


As for the rest of your post... Show me the money! I can't really
spend a bunch of time learning computer formats and CAD file standards
unless it helps me right now. I think it's great if you want to
compile the specs and whatnot into CAD files and XML and whathaveyou.
I just have no idea how to start or what it would accomplish.

I think it's great and reasonable to use solidworks or some cad
program to design and model the block. I just don't know about
getting it made like this. When I want a block for my system there's
about a 95% chance that I'm going to pull one out of another
thermocycler or the like.

Failing that I'd take it to my local machine shop with drawings and
specs and measurements and actual tubes to make sure they fit snug.

But my local machine shop guy wears greasy coveralls and says stuff
like "Get er dun!". If I start talking about XML files he's just
going to tell me that it's too fancy and either charge me a buttload
of cash (and throw the XML files in the trash) or tell me that he
doesn't do that sort of thing.

So I see a bit of disconnect between the real world and the idea of
all this computerization.

If there is a place that takes CAD files and produces an aluminum
block to your specs that's great. I'd love to use it. But I know
that the local "Get er dun" guy will give me a reasonable price and
work with me to get it right. He's right down the street and I don't
have to order 10,000 units for him to even talk to me.


-Jake

P.S. As for the glow worm that's debatable. I do know that deep sea
fish use glowing worm-like baits. And I doubt if a fish's black and
white vision and pin sized brain will care if it looks normal. I
think the main factor is just seeing an object that looks like what
the fish is used to eating.

Josh Perfetto

unread,
Apr 17, 2009, 6:16:32 PM4/17/09
to DIYBio Mailing List, Jake
On 4/17/09 2:05 PM, "Jake" <jake...@mail.com> wrote:

>> Also, I don't think the goal should be to construct a device
>> for as cheap as possible.
>
> I do think we should focus on this. If we make a really cheap device
> it will be available to more people. We also might attract people
> that otherwise would just buy a commercial unit. If our device is
> half the price of a commercial unit some small labs might just decide
> to go with the commercial unit. But if ours is 1/10th the price of a
> crummy commercial unit and it has a lot more features I think a lot of
> people would almost have to at least give the open thermocycler a try.

I agree that the cost should be something like 1/10th the cost of a
commercial system, but that's different than saying "as cheap as possible"
and driving every design decision by cost alone. That's where you get into
sacrificing features and usability to make the system 1/10.5th the cost,
which isn't necessarily worth it. That's why I said design something that
the greatest number of people will use, which certainly reflects cost.
Actually, a better way to state it might be "design something that the
greatest number of current non-thermocycler users will use" (i.e., the
greatest number of new users).

>> Anyone looking to do PCR as cheap as possible can already do so with 3 water
>> baths.
>
> I'm hoping that we can make this device cheaper (and with better temp
> control) than the cost of 3 water baths. Now I know you can just use
> a burner and a pot of water, but I don't think you can really get the
> level of temp control required to really do a good job, and
> thermostatically controlled waterbaths aren't cheap.

You can make a pretty cheap water bath with a fish tank heater or two.

> I'm not sure what features would cost all that much. I'm not
> suggesting hacking it together with duct tape and chewing gum. "As
> cheap as possible" to me means as cheap as you can do it while making
> it at least as good as a commercial unit.

I agree that making something at this price point should be doable. I was
referring more to marginal costs like serial/USB. "As cheap as
possible" is different from "as cheap as possible while making it at
least as good as a commercial unit" - but the latter would be a
perfect design goal!

> I think we need to figure out how many tubes people need to run. We
> should also check out what size p-junctions are cheap and available
> off the shelf. Then that will give us our optimal block size for
> cost, then hopefully we can reconcile the two numbers.

I usually run about 12 max, what do others do? Has the discussion of a
heated vs non-heated lid come up?

-Josh


Nathan McCorkle

unread,
Apr 17, 2009, 6:17:40 PM4/17/09
to diy...@googlegroups.com
So on the hyperterminal issue.... if it's sending a simple string, you could use java to parse the tabs and CRs, and I dunno if gnuplot is available for windows, but on linux I would just write the data as a file to throw into GNUplot.

otherwise, why not just go arduino?

http://arduino.cc/en/Main/Hardware

or something equivalent... what kinds of uCs can ppl here program? PICs? I can't do that... I do have a BX-24 dev kit and a serial to USB cable for it.

Nathan McCorkle

unread,
Apr 17, 2009, 6:19:44 PM4/17/09
to diy...@googlegroups.com
Oh, and my school ends in about 4 weeks, so if I could get a CAD file of an aluminum block, I could do a few before I head home... I dunno if the Comm College there has a machine shop... I know we also have a rapid prototyping machine (or few) ... never used them tho

Bryan Bishop

unread,
Apr 17, 2009, 6:29:33 PM4/17/09
to diy...@googlegroups.com, kan...@gmail.com
On Fri, Apr 17, 2009 at 5:02 PM, Jake <jake...@mail.com> wrote:
>> I'm still confused about this. Do you guys really use hyperterminal?
>> What operating system are you using??
>
> This works just fine, but it's a hassle.  Getting the data is the pain
> because I have to use hyperterminal, save the data to a text file,
> then import it (using the right options), then cut and paste it from
> that spreadsheet into the one that is already set up with the graphs,
> etc..  It would be really nice if there was a way to automate this so
> that the serial port is read directly into an Excel file.

How about this? A program that will automatically grab the data, save
it to multiple open standards file formats (I'm not too interested in
Excel, but if you want to use Excel after an OpenOffice step or
something, that's cool), and then some automatic graphing with
gnuplot. Heck, it's made for this. I don't even mention this stuff
normally because it just seems so ridiculously simple to me ..
dragging-and-dropping into stuff like excel just feels archaic and a
step backwards. Labware can get away with really really old stuff
because the people that use the equipment don't necessarily know of
how better things are on the outside world.
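As a sketch of that pipeline (the serial-reading side, e.g. with pyserial, is omitted; assume you already have a list of (time, temp) samples): write them in the whitespace-separated column format gnuplot reads natively, then point gnuplot at the file. The filename and the gnuplot one-liner are just illustrative:

```python
def write_gnuplot_dat(samples, dat_path):
    """Write (time_s, temp_c) samples as whitespace-separated columns,
    the plain-text format gnuplot reads natively."""
    with open(dat_path, "w") as f:
        f.write("# time_s  temp_C\n")
        for t, temp in samples:
            f.write(f"{t:.1f}  {temp:.2f}\n")

# Then plot with something like:
#   gnuplot -e 'set xlabel "time (s)"; set ylabel "temp (C)"; \
#               plot "run.dat" with lines; pause -1'
```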

> I'm sure that someone knows how to do this better than me.  However I
> think it can be done how I mentioned and that might be a good starting
> point to build upon.
>

> As for the rest of your post... Show me the money!  I can't really
> spend a bunch of time learning computer formats and CAD file standards
> unless it helps me right now.  I think it's great if you want to
> compile the specs and whatnot into CAD files and XML and whathaveyou.
> I just have no idea how to start or what it would accomplish.

Here's how you start. You open up a CAD app, like HeeksCAD, and start
drawing your machine. No more bloody JPEGs. Then go to file->save.
Then upload the file to some publicly accessible place. There are
tutorials on this that I'll assemble in a future email for
CAD-related-links and tutorials and such.

> I think it's great and reasonable to use solidworks or some cad
> program to design and model the block.  I just don't know about
> getting it made like this.  When I want a block for my system there's
> about a 95% chance that I'm going to pull one out of another
thermocycler or the like.

If you're making a hacked thermocycler made up of other parts, then
yeah I don't expect you to magically have a CAD file of the
components, but it would be nice. That's like a first-year reverse
engineering project that they don't teach you at a university anymore
(it's one of those introductory projects that people just end up doing
anyway, sort of thing). I don't know what you mean when you say you
don't know how to get it made "like this".

> Failing that I'd take it to my local machine shop with drawings and
> specs and measurements and actual tubes to make sure they fit snug.

Those drawings can be generated from 3D CAD programs, by the way.

(Are you sometimes meaning to say spectrophotometer instead of specifications?)

> But my local machine shop guy wears greasy coveralls and says stuff
> like "Get er dun!".  If I start talking about XML files he's just
> going to tell me that it's too fancy and either charge me a buttload
> of cash (and throw the XML files in the trash) or tell me that he
> doesn't do that sort of thing.

Maybe you need better machinists? But seriously, I don't understand
how you think that XML can't be converted into other formats. I also
have not specifically said anything about an XML format for CAD files.
I think you're making this harder than it really is.. XML for
protocols is one thing, but I didn't mention XML for CAD.

> So I see a bit of disconnect between the real world and the idea of
> all this computerization.

I think you're the disconnect at this point. :-/ Sorry man.

Meredith L. Patterson

unread,
Apr 17, 2009, 6:52:39 PM4/17/09
to diy...@googlegroups.com
On Sat, Apr 18, 2009 at 12:17 AM, Nathan McCorkle <nmz...@gmail.com> wrote:
> So on the hyperterminal issue.... if it's sending a simple string, you could
> use java to parse the tabs and CRs, and I dunno if GNU plot is available for
> windows, but on linux I would just write the data as a fileto throw into
> GNUplot.
>
> otherwise, why not just go arduino?
>
> http://arduino.cc/en/Main/Hardware
>
> or something equivalent... what kinds of uCs can ppl here program? PICs? I
> can't do that... I do have a BX-24 dev kit and a serial to USB cable for it.

I program Arduinos all the time, and it's not that big of a jump from
the Arduino to a plain old AVR. For that matter, the Arduino
bootloader can be installed on an AVR so that new firmware can be
developed and uploaded easily.

I've also worked with Freescale uCs, which are really straightforward.

All this talk about hyperterminal and serial cables is making me hold
my head and groan. Seriously, fuck hyperterminal. What's the point? We
can easily add an RJ-45 socket to the *board*, throw a small TCP/IP
stack onto the microcontroller, set the microcontroller up to receive
commands over the ethernet cable in HTTP, and have it return its
output in XHTML -- ie, XML that is also valid HTML.

This means that, as Tito suggested, we can have a "Web 2.0
thermocycler". Fuck shitty pushbutton interfaces on the thermocycler
itself. I want to be able to open up a web browser, point it at my
thermocycler's address on my home network, paste the sequence that I'm
amplifying into a form on that webpage, paste in the sequences for the
primers I'm using, and have the *thermocycler itself* compute the
exact right temperature and time settings for this particular PCR run,
and then click "Start". I also want to be able to watch realtime
updates of the data coming back from the thermocycler, in my browser,
and save it off to disk whenever I want.
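As a proof of concept, the status-page half of this can be mocked up on a PC in a few lines of Python (the endpoint name and state fields are made up; on the real device this would run on the microcontroller's embedded TCP/IP stack):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Mock cycler state; on real hardware this would be read from the uC.
STATE = {"step": "denature", "block_temp_c": 94.2, "cycle": 17}

def render_status(state):
    """Render the cycler state as a minimal (X)HTML page."""
    items = "".join(f"<li>{k}: {v}</li>" for k, v in sorted(state.items()))
    return f"<html><body><ul>{items}</ul></body></html>"

class CyclerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = render_status(STATE).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

# To serve (blocks forever; point a browser at http://127.0.0.1:8080/status):
#   HTTPServer(("127.0.0.1", 8080), CyclerHandler).serve_forever()
```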

And we *can* do this. I already know how and I'm sure Bryan and other
programmers on this list have an idea of how to do it.

This is why Tito is asking for User Stories of "what happens when I
use a thermocycler". Jake, you don't *need* to know CAD formats or how
to program in order for this project to work -- the programmers are
champing at the bit to make this happen. We want to make it so that
using a thermocycler is as easy as using Facebook.

Cheers,
--mlp

Jake

unread,
Apr 17, 2009, 7:06:24 PM4/17/09
to DIYbio
> Has the discussion of a heated vs non-heated lid come up?

Mine doesn't have a heated lid and I'm not sure exactly how they
work. I'm guessing they maintain a couple degrees higher than the
block so that water condenses back into the mix rather than forming on
the top. I think they are probably a nice feature and one that we
should include.

I agree with you Josh about the design. I think we are on the same
page. I figure that we at least need a uC and anything with a uC
controller will be able to do pretty much anything we want.

As far as serial vs USB... I don't see any advantage to the USB, so
if serial is cheaper or easier to program and implement then I think
we should go with that. Since you can easily and cheaply get a serial
-> usb adapter I can't really see any reason for using USB. Now if it
were cheaper to implement or easier to program then we should go with
the USB.

> I usually run about 12 max, what do others do?

I only run one or two at a time. What exactly are you running 12
for? I can see for diagnostics if you need to run 12 samples with the
same test you'll want 12 slots for tubes. But for most DIY stuff I'd
think that you'd only be running one or two things at a time. Maybe
two samples, a negative control, and a positive control.

I think this goes back to what size p-junctions are cheap and
available. Since a lot of people might just need a couple slots,
maybe we should just fit as many as we can on the block that fits on
the cheap p-junctions we can get. Then people that need more can just
add extra blocks to be controlled by the same uC.
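To make the multi-block idea concrete, here's a sketch of how one controller loop could derive a per-block setpoint from independent cycle programs (the data layout is just illustrative, and the "quickie" program is the one from the top of the thread):

```python
# A cycle program is a list of (setpoint_C, hold_seconds) steps,
# repeated for some number of cycles.
def setpoint_at(program, cycles, t):
    """Return the target temperature for a block at elapsed time t (s),
    or None once its program has finished."""
    period = sum(hold for _, hold in program)
    if t >= period * cycles:
        return None
    t %= period
    for temp, hold in program:
        if t < hold:
            return temp
        t -= hold

# One controller can poll several independent blocks, each with its
# own program and cycle count:
quickie = [(94, 30), (45, 30), (72, 30)]  # "quickie" PCR from above
blocks = {"A": (quickie, 30), "B": ([(94, 60), (55, 60), (72, 60)], 25)}
targets = {name: setpoint_at(prog, n, 45.0)
           for name, (prog, n) in blocks.items()}
```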

> If I could get a CAD file of an aluminum block, I could do a few before I head home...

That would be great. I'll take measurements next time I'm home. If
we find the cheap p-junctions we can just take measurements from a
standard block sized to that. The hard part in my mind is going to be
getting the measurements of the eppendorf tubes and finding the right
drill bit type deal to make that type of hole so that it fits snugly.


-Jake

Josh Perfetto

unread,
Apr 17, 2009, 7:09:33 PM4/17/09
to DIYBio Mailing List, Meredith L. Patterson
I want this too. I can program microcontrollers and stuff but don't know
how to add the ethernet port to the board, and none of the Arduino boards
seem to have this. How would you go about doing this?

-Josh

Meredith L. Patterson

unread,
Apr 17, 2009, 7:10:00 PM4/17/09
to diy...@googlegroups.com
On Sat, Apr 18, 2009 at 1:06 AM, Jake <jake...@mail.com> wrote:
> As far as serial vs USB... I don't see any advantage to the USB, so
> if serial is cheaper or easier to program and implement then I think
> we should go with that. Since you can easily and cheaply get a serial
> -> usb adapter I can't really see any reason for using USB. Now if it
> were cheaper to implement or easier to program then we should go with
> the USB.

Microcontrollers that natively support USB are not appreciably more
expensive than microcontrollers that don't. It's literally a matter of
a few cents to a few dollars. I would be shocked if the difference
were more than $5.

--mlp

Meredith L. Patterson

unread,
Apr 17, 2009, 7:26:35 PM4/17/09
to Josh Perfetto, DIYBio Mailing List
The expensive ($40.00) way is the Arduino ethernet shield:

http://www.ladyada.net/make/eshield/
http://www.arduino.cc/en/Reference/Ethernet

which would be handy for prototyping, and I'll probably get one. $15
of that is the ethernet shield itself, and $25 of that is the WIZnet
WIZ811MJ module.

The cheap way that involves more design work is to design the board to
support ethernet, and include appropriate firmware support on the
firmware image. Here's an example of an ATmega16-based,
ethernet-enabled board using a surface-mount ethernet controller IC:

http://www.avrfreaks.net/index.php?module=Freaks%20Tools&func=viewItem&item_id=357

Here's another one on the ATmega88 using a DIP-package Ethernet
controller IC (easy to solder!):

http://shop.tuxgraphics.org/electronic/detail_avrwebserver.html

and I'm sure that there's a lot of open-source material about how to
do ethernet properly on the AVR. If my understanding is correct, any
microcontroller that speaks SPI will be able to handle ethernet just
fine, and the DIP Ethernet controller for the board linked above is a
whole EUR 4.25.

Cheers,
--mlp

Bill Flanagan

unread,
Apr 17, 2009, 7:27:36 PM4/17/09
to diy...@googlegroups.com, Meredith L. Patterson
Ethernet Module for ARDUINO & FREEDUINO Board $22 + 6.50 for shipping. 

http://cgi.ebay.com/Ethernet-Module-for-ARDUINO-FREEDUINO-Board_W0QQitemZ290303832042QQcmdZViewItemQQptZLH_DefaultDomain_0?hash=item290303832042&_trksid=p3286.c0.m14&_trkparms=72%3A1205|66%3A2|65%3A12|39%3A1|240%3A1318|301%3A1|293%3A1|294%3A50

If you can program the USB port, you can probably handle the Ethernet as well. It's got the entire TCP stack already built in; you just drive it. There's also a wifi version but it's newer and more expensive. The Ethernet board drops on top as a standard Arduino module. 

Alec Nielsen

unread,
Apr 17, 2009, 7:30:48 PM4/17/09
to diy...@googlegroups.com
Love the idea of a networked thermocycler, and real-time, easily accessible data.

With respect to the number of tube sockets, I've run a couple dozen reactions at once before. For high throughput assembly/analysis it's nice, but I think it's probably more useful for gradient PCR. To obtain the best yield (or sometimes the only yield) from PCR, the ability to do gradients is indispensable. I vote that the thermocycler have at least 12 sockets (ideally more) and be thermal-gradient-capable.

I have some experience controlling Peltier blocks, but not much. For the ability to do gradients, would we control many smaller heat blocks? Or are there large Peltier units that can be controlled to have variable temperatures across their face?

Alec

Meredith L. Patterson

unread,
Apr 17, 2009, 7:31:16 PM4/17/09
to Bill Flanagan, diy...@googlegroups.com
Beautiful, that's even cheaper. I had been thinking about building
something on top of uIP, but I'll happily take a built-in TCP stack.
:) Thanks, Bill!

Cheers,
--mlp

Bill Flanagan

unread,
Apr 17, 2009, 7:33:10 PM4/17/09
to diy...@googlegroups.com
I know it's easier to use Hyperterminal to talk to an RS232 port if you're already used to it, but using the standard USB interface to read and write data isn't a lot harder. With so few PCs being shipped with serial interfaces, USB may be a better long-term direction. But if you're doing the work and feel comfortable with serial, I'd do what works and leave it to someone who needs it or wants to contribute to DIY later. 

Cory Tobin

unread,
Apr 17, 2009, 7:34:46 PM4/17/09
to diy...@googlegroups.com
It seems like there are two different schools of thought as to what
should constitute a DIY thermocycler.

1) should have a minimal cost, be more consistent than hot baths, and
be targeted towards the non-current-thermocycler-user's needs.
2) should have more advanced features (e.g. ethernet), possibly cost a
little more, maybe have more wells, targeting the
current-thermocycler-user's needs.

If there's enough talent around here, might it be advantageous to have
2 separate DIY thermocycler projects? Just a thought.


-Cory

Bill Flanagan

unread,
Apr 17, 2009, 7:37:02 PM4/17/09
to Meredith L. Patterson, diy...@googlegroups.com
It's a great market. There are Chinese firms doing the Arduino boards now that there's a standard to support and a market to sell into. God bless free enterprise. 

Meredith L. Patterson

unread,
Apr 17, 2009, 7:43:28 PM4/17/09
to diy...@googlegroups.com
I personally am equally comfortable with programming for serial
or USB, and I absolutely agree that USB is the way to go simply
because fewer and fewer computers have RS232 ports on them. I'm a Mac
user. My only options are USB and FireWire, unless I want to buy a
USB-to-serial dongle, which I don't.

Cheers,
--mlp

Josh Perfetto

unread,
Apr 17, 2009, 7:52:54 PM4/17/09
to DIYBio Mailing List, Jake
On 4/17/09 4:06 PM, "Jake" <jake...@mail.com> wrote:

> Mine doesn't have a heated lid and I'm not sure exactly how they
> work. I'm guessing they maintain a couple degrees higher than the
> block so that water condenses back into the mix rather than forming on
> the top. I think they are probably a nice feature and one that we
> should include.

Ones I've used have a heated lid. I've been led to understand that ones
without a heated lid will have condensation on the tops of the tubes unless
you put some type of oil on them, but I've never used these machines myself
so don't know if that's still true or there are other alternatives. I'd
prefer having a heated lid to having to put oil on PCR tubes. Probably a
low-tech heating element would suffice.

> As far as serial vs USB... I don't see any advantage to the USB, so
> if serial is cheaper or easier to program and implement then I think
> we should go with that. Since you can easily and cheaply get a serial
> -> usb adapter I can't really see any reason for using USB. Now if it
> were cheaper to implement or easier to program then we should go with
> the USB.

As Meredith just said, the cost difference between serial and USB is not
that much. Here are the advantages I see of USB over serial:

1. Many modern computers don't have serial interfaces, which would require
people to purchase separate USB to serial adapter cables at additional cost

2. This complicates the setup since there are additional steps and drivers
to install

3. These cables can be unreliable if you need more than one. I once
literally spent a combined total of two working days trying to get 3 such
cables to work on one PC. The drivers did not work well when there were
multiple cables, and kept causing intermittent blue screen crashes on
Windows. I eventually arrived at a combination of different brands that
would mostly work reliably after many trips to Fry's.

You may say big deal, this can be overcome. True, but it's just not worth
it given the minor cost of USB.

Here are the advantages I see of ethernet over USB:

1. Having each device use USB is not scalable. USB cables have a limited
distance & no switching infrastructure, and at some point connecting all of
these devices to a single computer is not practical due to physical space.
The PCR machine is not the end; there will be an open spectrophotometer etc. one
day :)

2. USB is not wireless, which means you can't run around your lab with a
MacBook and control everything.

3. This situation encourages the use of additional dedicated fixed computers
to control the lab devices, and now you have to deal with moving data
between those machines and the notebook machine you primarily use.

4. An ethernet & TCP/IP interface enables devices to be controlled from
multiple PCs, which is indispensable in a shared setting.

>
>> I usually run about 12 max, what do others do?
>
> I only run one or two at a time. What exactly are you running 12
> for? I can see for diagnostics if you need to run 12 samples with the
> same test you'll want 12 slots for tubes. But for most DIY stuff I'd
> think that you'd only be running one or two things at a time. Maybe
> two samples, a negative control, and a positive control.

I mostly do what you do with 4-6 tubes. 12 is the max that I've done with
any frequency, which was for gradient PCR.

-Josh


Jake

unread,
Apr 17, 2009, 8:00:50 PM4/17/09
to DIYbio
Meredith said:
"All this talk about hyperterminal and serial cables is making me
hold
my head and groan. Seriously, fuck hyperterminal. What's the point?
We
can easily add an RJ-45 socket to the *board*, throw a small TCP/IP
stack onto the microcontroller, set the microcontroller up to
receive
commands over the ethernet cable in HTTP, and have it return its
output in XHTML -- ie, XML that is also valid HTML."

And how much is this going to cost? What does it add to the project?

I'm a biologist, I don't give a damn how the thing communicates. Once
the programming is done I don't care if I open a webpage, or start a
program, or type the commands into a prompt. The damn thing could use
morse code to communicate for all I care. It is abstracted from me.
All I care about is that I can put in my program and somehow it makes
it to the thermocycler.

It's great that we actually have programmers here who want to make
this project great. But don't bog down the actual functionality
worrying about bells and whistles. I know if you had your way you'd
have a wireless thermocycler using cluster computing, bluetooth,
paging you on your cellphone, sharing data with every other one built,
interfacing to a central relational database, SQL, XML, fuzzy logic,
etc., etc..

To me that's just added complexity with no additional value. You
can't load a sample remotely so there is no reason to start it
remotely.

> This means that, as Tito suggested, we can have a "Web 2.0 thermocycler".

What's the cost? I have a hard time believing you can do that as
cheaply as a simple serial interface. Every bit of scientific gear
I've used either uses a simple serial interface or it uses a custom
program. I've even used stuff that writes its data to a floppy
disk. I've never seen anything that uses a web interface.

> Fuck shitty pushbutton interfaces on the thermocycler itself.

We're going to have to have a few buttons and lights on the device
itself. The uC already has GPIO pins for this, so there's no reason
not to have them. I want something that can run standalone. The
computer is just going to upload programs to the cycler. Maybe it can
even start them or run in a computer-controlled mode. Certainly
it can get data from the cycler about the runs, etc. But it needs to
at least have some basic functionality without being hooked up to a
computer.

> paste the sequence that I'm amplifying into a form
> on that webpage, paste in the sequences for the
> primers I'm using, and have the *thermocycler itself*
> compute the exact right temperature and time settings
> for this particular PCR run,

Nothing I know of does this automatically so there's no reason to
include this in the base design. Later on the interface program can
be improved to do this, but I see no reason to try and make the uC do
this.

I thank you for helping with the project, but I hope we can scale back
expectations, save future improvements for the future, and just get
something working. If we get too far ahead with feature ideas we're
never going to get the basics done. We haven't done anything at all
about nailing down basic specs and finding parts that will work and
already we're talking about integrated webservers running it. Maybe
I'm behind the curve, but it all seems like putting the cart before
the horse, so to speak.


-Jake

Meredith L. Patterson

unread,
Apr 17, 2009, 8:04:04 PM4/17/09
to diy...@googlegroups.com
On Sat, Apr 18, 2009 at 1:52 AM, Josh Perfetto <jo...@snowrise.com> wrote:
> Here are the advantages I see of ethernet over USB:

I'd actually like to have both ethernet and USB, though for different
purposes: ethernet for accessing the control interface over the
network, USB for flashing the microcontroller with a firmware update.
We'll want to have an upgrade path available for the firmware, because
expecting software to be bug-free is foolish and there's always the
possibility of adding new functionality. It's possible to flash the
device over Ethernet, but I'd rather not enable that on a networked
device, for two reasons:

1. Security. It's much safer to download a firmware image to some
USB-enabled machine (e.g., your laptop) and upload it to a device that
you're in the same room as; you *don't* want some random person
flashing your device over the internet with potentially malicious
firmware.

2. Ease of use. If we use the Arduino bootloader, even if we're not
using a full Arduino board, the ability to flash new firmware onto the
device over USB is already there, for free.

Cheers,
--mlp

Meredith L. Patterson

unread,
Apr 17, 2009, 8:26:26 PM4/17/09
to diy...@googlegroups.com
On Sat, Apr 18, 2009 at 2:00 AM, Jake <jake...@mail.com> wrote:
> And how much is this going to cost? What does it add to the project?

No more than $20 in components, and some amount of programming time
from people like me.

> What does it add to the project?

Ease of use for people who aren't comfortable with the command line,
which is more biologists than you would expect. The PI on the project
I worked on last year at Berkeley couldn't find her way around a
command line even with me holding her hand the entire way; the only
thing she understood was web applications. Biologists are used to
things like NCBI's BLAST website.

> I know if you had your way you'd
> have a wireless thermocycler using cluster computing, bluetooth,
> paging you on your cellphone, sharing data with every other one built,
> interfacing to a central relational database, SQL, XML, fuzzy logic,
> etc., etc..

I can't think of good justifications for most of those "features", and
they'd be a fat pain in the ass to put on a cheap 8-bit
microcontroller. An ethernet controller with its own built-in TCP/IP
stack and a microcontroller that speaks USB are cheap and provide
important value-adds.

> To me that's just added complexity with no additional value. You
> can't load a sample remotely so there is no reason to start it
> remotely.

You can, however, *monitor* it remotely.

>> This means that, as Tito suggested, we can have a "Web 2.0 thermocycler".
>
> What's the cost? I have a hard time believing you can do that as
> cheaply as a simple serial interface. Every bit of scientific gear
> I've used either uses a simple serial interface or it uses a custom
> program. I've even used stuff that writes it's data to a floppy
> disk. I've never seen anything that uses a web interface.

Ever used a wireless router? They have web interfaces. Are they
scientific gear? Not in the strictest sense, but the principle is
*exactly the same*. This is a solved problem.

>> Fuck shitty pushbutton interfaces on the thermocycler itself.
>
> We're going to have to have a few buttons and lights on the device
> itself. The uC already has GPIO pins for this, so there's no reason
> not to have them. I want something that can run standalone. The
> computer is just going to upload programs to the cycler. Maybe it can
> even start them or even run in a computer controlled mode. Certainly
> it can get date from the cycler about the runs, etc.. But it needs to
> at least have some basic functionality without being hooked up to a
> computer.

What lab doesn't have a computer in it somewhere? What lab doesn't
have a wired network?

>> paste the sequence that I'm amplifying into a form
>> on that webpage, paste in the sequences for the
>> primers I'm using, and have the *thermocycler itself*
>> compute the exact right temperature and time settings
>> for this particular PCR run,
>
> Nothing I know of does this automaticaly so there's no reason to
> include this in the base design.

So, what, do you compute your temperature and time settings by hand
and program them in on the pushbutton interface on your thermocycler?
Isn't that a pain in the ass? Wouldn't you rather not have to do the
math?

> I thank you for helping with the project, but I hope we can scale back
> expectations, save future improvements for the future, and just get
> something working. If we get too far ahead with feature ideas we're
> never going to get the basics done. We haven't done anything at all
> about nailing down basic specs and finding parts that will work and
> already we're talking about integrated webservers running it. Maybe
> I'm behind the curve, but it all seems like putting the cart in front
> of the horse so to speak.

Jake, I understand your concerns, but believe me, I want a working DIY
thermocycler that doesn't suck just as much as you do. The reason I
don't want to get bogged down in outdated interfaces like RS232 is
because if we go that route, we're limiting ourselves unnecessarily
right out of the gate, whereas if we go with USB/ethernet, we're
providing an interface that will work with *any* existing computer.
Why a web interface? Both ease of use and ease of maintenance: the
other option is to provide drivers, like printer drivers, and
client-side data-reading applications for Windows, Mac and Linux, and
having to support three separate platforms is a giant pain in the ass.

I'm not pretending to be the world's expert on microcontroller
programming or anything, but I have done several microcontroller-based
projects that are about this complicated -- see
http://www.youtube.com/watch?v=bgfAJspEsUs for just one example, a
line-level meter using the Freescale microcontroller and LED array on
the DEFCON '07 badge.

As far as the basic specs go, Tito and others have been begging for
you working biologists to speak up about what the non-logic parts of a
thermocycler need to do. How about we quit arguing about this, and you
help sort out what the physical aspects of a thermocycler should be
like?

One thing I'd like to understand is gradient PCR, since I gather from
Josh and others that this would be a Good Thing to support. What's it
for, and what happens during a gradient PCR run?

Cheers,
--mlp

Josh Perfetto

unread,
Apr 17, 2009, 8:31:58 PM4/17/09
to DIYBio Mailing List, Jake
On 4/17/09 5:00 PM, "Jake" <jake...@mail.com> wrote:
> I'm a biologist, I don't give a damn how the thing communicates. Once
> the programming is done I don't care if I open a webpage, or start a
> program, or type the commands into a prompt. The damn thing could use
> morse code to communicate for all I care. It is abstracted from me.

If you are entering commands in Hyperterminal, the communication really
isn't abstracted from you.

> It's great that we actually have programmers here who want to make
> this project great. But don't bog down the actual functionality
> worrying about bells and whistles. I know if you had your way you'd
> have a wireless thermocycler using cluster computing, bluetooth,
> paging you on your cellphone, sharing data with every other one built,
> interfacing to a central relational database, SQL, XML, fuzzy logic,
> etc., etc..

I don't think anyone is really saying that. This is all about making things
easier and more productive for the biologist, not adding bells and whistles
for the sake of it.

> Every bit of scientific gear
> I've used either uses a simple serial interface or it uses a custom
> program. I've even used stuff that writes it's data to a floppy
> disk. I've never seen anything that uses a web interface.

That sounds awful.

> Nothing I know of does this automaticaly so there's no reason to
> include this in the base design. Later on the interface program can
> be improved to do this, but I see no reason to try and make the uC do
> this.

You are right - all the basic design needs to get right is the hardware
thermocycler and networking interface. That is enough to allow other people
to separately create these programs to do all this and more, which will
eventually make it even easier for the biologist. The beauty of this is
that it is entirely decoupled from this project - all we have to do is get
the interface right.

> I thank you for helping with the project, but I hope we can scale back
> expectations, save future improvements for the future, and just get
> something working. If we get too far ahead with feature ideas we're
> never going to get the basics done. We haven't done anything at all
> about nailing down basic specs and finding parts that will work and
> already we're talking about integrated webservers running it. Maybe
> I'm behind the curve, but it all seems like putting the cart in front
> of the horse so to speak.

You are right, we have to deal with the chassis, peltier junction plate
design, and get the hardware specs straightened out. If you know more about
this part you can probably help get this started off - I know more about
software and biology than physical hardware. Don't worry so much about the
communication and software details if that is not your forte - there are
many here that will get this part working with no problem.

-Josh


ben lipkowitz

unread,
Apr 17, 2009, 8:37:34 PM4/17/09
to DIYbio
On Fri, 17 Apr 2009, Jake wrote:

> can easily add an RJ-45 socket to the *board*, throw a small TCP/IP
> stack onto the microcontroller, set the microcontroller up to
>

> And how much is this going to cost? What does it add to the project?

it costs ... *drumroll* ... about $30 in buy-it-now Arduino
plug-together-like-lego format. much less in DIY solder-together format.

let's call this add-on interface module number 1

it adds a nice graphical interface, access from any computer in the lab,
and reduces time spent configuring communications protocols to zero. it
also completely removes the requirement for a shitty pushbutton interface.


>> This means that, as Tito suggested, we can have a "Web 2.0 thermocycler".
>
> What's the cost? I have a hard time believing you can do that as
> cheaply as a simple serial interface. Every bit of scientific gear
> I've used either uses a simple serial interface or it uses a custom
> program. I've even used stuff that writes it's data to a floppy
> disk. I've never seen anything that uses a web interface.

this is because you've only used closed-source proprietary scientific
gear. if scientists had access to the protocol specification, CAD files,
and firmware used to design the labware in the first place, you better
believe there would be plenty of 'mods' for that old spectrometer.

the design process of a group of users building devices for their own use
is totally different from what goes on in most corporations. typically
the results end up being much more flexible and modular.

the serial cable + adapter will probably end up costing about the same as
a web interface, unless you already have them, and aren't using them for
something else, or are really good at finding deals.

>> Fuck shitty pushbutton interfaces on the thermocycler itself.
>
> We're going to have to have a few buttons and lights on the device
> itself. The uC already has GPIO pins for this, so there's no reason
> not to have them. I want something that can run standalone. The
> computer is just going to upload programs to the cycler. Maybe it can
> even start them or even run in a computer controlled mode. Certainly
> it can get date from the cycler about the runs, etc.. But it needs to
> at least have some basic functionality without being hooked up to a
> computer.

stand-alone interface is a functionality that others may not need.

buttons and switches and shitty LCD screens end up costing more than you
expect, all said and done. expect a 4x20 character LCD to cost about $15,
and you need a board to mount the buttons on.

let's call this add-on interface module number 2

>> paste the sequence that I'm amplifying into a form
>> on that webpage, paste in the sequences for the
>> primers I'm using, and have the *thermocycler itself*
>> compute the exact right temperature and time settings
>> for this particular PCR run,
>
> Nothing I know of does this automaticaly so there's no reason to
> include this in the base design. Later on the interface program can
> be improved to do this, but I see no reason to try and make the uC do
> this.

what sort of calculations (and how many?) are required to compute
temperatures from a sequence? this seems like something that could be done
in javascript on the web page if it's too hard for the micro.
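Not many, going by the Wallace-rule formulas quoted at the top of the thread: it's just character counting. A minimal sketch in Python (function names are mine; the real thing would be a few lines of Javascript on the page or C on the micro):

```python
def tm(primer):
    """Wallace rule from the protocol above: Tm = 4(G+C) + 2(A+T), in deg C."""
    p = primer.upper()
    return 4 * (p.count("G") + p.count("C")) + 2 * (p.count("A") + p.count("T"))

def ta(fwd, rev):
    """Annealing temperature: 5 C below the lower Tm of the primer pair."""
    return min(tm(fwd), tm(rev)) - 5
```

GC pairs count double, which is why GC-rich primers melt hotter. This simple rule only holds for short primers; fancier nearest-neighbor models exist, but even those would fit in Javascript.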

Meredith L. Patterson

unread,
Apr 17, 2009, 8:39:45 PM4/17/09
to diy...@googlegroups.com
On Sat, Apr 18, 2009 at 2:37 AM, ben lipkowitz <fe...@sdf.lonestar.org> wrote:
> what sort of calculations (and how many?) are required to compute
> temperatures from a sequence? this seems like something that could be done
> in javascript on the web page if it's too hard for the micro.

Computing it in Javascript was in fact exactly my plan -- thanks for
reading my mind. :)

Cheers,
--mlp

Cory Tobin

unread,
Apr 17, 2009, 8:57:59 PM4/17/09
to diy...@googlegroups.com
> One thing I'd like to understand is gradient PCR, since I gather from
> Josh and others that this would be a Good Thing to support. What's it
> for, and what happens during a gradient PCR run?

In a typical PCR run you cycle between 3 different temperatures.
First you go to ~96C to melt the DNA into single strands so the
primers have something to bind to. Second, you go to the annealing
temperature where the primers bind to the template DNA, somewhere
between 50C and 65C. Third, you go to the elongation temperature (~72C
) where the polymerase works most efficiently.
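Those three steps are all the firmware ultimately has to execute, so the whole program can be plain data. A sketch (names and the representation are mine; the temperatures and times are illustrative values from the protocol at the top of the thread, not a recommendation):

```python
# One cycle = (step name, temperature in deg C, hold time in seconds).
PROGRAM = [
    ("denature", 94.0, 30),
    ("anneal",   60.0, 30),
    ("elongate", 72.0, 30),
]

def schedule(program, cycles):
    """Flatten the cycling program into the full list of (temp, secs) setpoints."""
    return [(temp, secs) for _ in range(cycles) for _, temp, secs in program]
```

`schedule(PROGRAM, 30)` yields 90 setpoints; a final 72 C / 5 min hold would be appended for complete elongation.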

To get the best results you have to tune the annealing temperature to
the specific primers you are using. If you don't know what the
optimal temperature is then you can use temperature gradient PCR to do
the reaction at a bunch of different temperatures all at the same
time. Each well differs by 1 degree or so. Then you run the products
out on a gel and see what temperature worked best.

As far as I know, the algorithms for calculating the best temperature
are pretty spot on. I've done hundreds of PCR runs and never had a
problem with annealing temperatures. I base my annealing temperature
on Primer3's calculated annealing temperature (
http://frodo.wi.mit.edu/ ) and it always seems to work. One situation
where a gradient PCR machine might come in handy is if you have a
bunch of different primer pairs that have a wide range of annealing
temperatures and you want to run them all at the same time on the same
machine. To get around this I just try to make all my primers anneal
around 60C, give or take a degree, and then run the PCR at 60C.

IMHO, if including this feature is going to significantly raise the
price of the machine then I wouldn't worry about it. Otherwise I
suppose it might come in handy occasionally.


-Cory

Josh Perfetto

unread,
Apr 17, 2009, 9:01:59 PM4/17/09
to DIYBio Mailing List, Meredith L. Patterson
On 4/17/09 5:26 PM, "Meredith L. Patterson" <clon...@gmail.com> wrote:
> One thing I'd like to understand is gradient PCR, since I gather from
> Josh and others that this would be a Good Thing to support. What's it
> for, and what happens during a gradient PCR run?

Say you are trying to amplify a specific fragment of DNA. You use this
great web application :) to calculate all the parameters like annealing
temperature, do the run, and the run it on a gel to examine the result.
Unfortunately you discover two bands, meaning that more than the specific
fragment you wanted was amplified in the reaction. Sometimes this is ok,
but for many uses of PCR it isn't, and increasing PCR specificity is one of
the major headaches of PCR.

Now the calculation models have let you down - they gave you a great starting
place, but to make the reaction more specific, you're going to have to
empirically test and optimize the reaction parameters. This sucks because
it is a multivariate problem: optimizing the concentration of your
primers, Taq, and template, adjusting the number of cycles, adjusting annealing
temperature, etc. The good news is that the annealing temperature parameter
tends to have the largest impact.

So now we have to do another run, and this time we will empirically test a
range of annealing temperatures of say 55-68C at 1C intervals. Then we will
run the result on a gel, and we'll see what temperature is the most specific
(ideally only one sharp band).

To do this, the thermocycler has to be able to create different annealing
temperatures in different wells during the same run. Usually this is done
in rows, so each row has an identical temperature. I'm not sure how the
commercial machines capable of gradient PCR do this (i.e. How many different
peltier junctions are involved), but they create this temperature gradient
across the wells.
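That 55-68C sweep maps directly onto per-row setpoints. A sketch of the row-temperature math (assuming one independently controllable zone per row; the function name is mine):

```python
def gradient_rows(low, high, rows):
    """Evenly spaced annealing temperatures, one per row, low..high inclusive."""
    if rows == 1:
        return [low]
    step = (high - low) / (rows - 1)
    return [low + i * step for i in range(rows)]
```

For example, `gradient_rows(55.0, 68.0, 14)` gives exactly the 1C intervals in the example above; fewer rows just means a coarser sweep over the same range.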

-Josh


Bryan Bishop

unread,
Apr 17, 2009, 9:05:56 PM4/17/09
to diy...@googlegroups.com, kan...@gmail.com
On Fri, Apr 17, 2009 at 8:01 PM, Josh Perfetto <jo...@snowrise.com> wrote:
> On 4/17/09 5:26 PM, "Meredith L. Patterson" <clon...@gmail.com> wrote:
>> One thing I'd like to understand is gradient PCR, since I gather from
>> Josh and others that this would be a Good Thing to support. What's it
>> for, and what happens during a gradient PCR run?
>
> Now the calculation models have let you down - it gave you a great starting
> place, but to make the reaction more specific, you're going to have to
> empirically test and optimize the reaction parameters.  This sucks because
> it is a multivariate problem with the optimizing the concentration of your
> primers, Taq, template, adjusting number of cycles, adjusting annealing
> temperature, etc.  The good news is that the annealing temperature parameter
> tends to have the largest impact.

On this note, from a logistical perspective, I have always found it
somewhat annoying to have to sit down by hand and chart out an
algorithm or plan of action for the optimization of some multivariable
problem thingy. I mean, that's science for you, no doubt, but it just
kind of sucks to get everything rolling and find some organizational scheme
that you can fit into your poor head. So, I'm going to be sure to
write some more software to help solve this sort of problem in the
future .. maybe something that helps to lay out the plan of action, or
something, and then can set up programs to run on the thermocycler for
that sort of optimization problem. Gah, this brings back some terrible
memories of PCR optimization back when I was doing some reverse
engineering work on Winfree's DNA stuff. It gets nasty.

Eric Zhang

unread,
Apr 17, 2009, 9:10:59 PM4/17/09
to diy...@googlegroups.com
Gradient PCR is done when you're having trouble with a particular template / primer mix.  Instead of annealing at the same temperature for the entire run, you set an annealing temperature gradient (temperature range).  This way, you don't have to do multiple runs, potentially wasting valuable template DNA, to get your product.

Annealing is the step in PCR when your primer sits down on the template DNA before polymerase joins in and starts grabbing dNTPs to catalyze the reaction.  The optimal temperature for annealing depends on the nucleotide content (GC content) of your complementary region that you created when you designed your primers.

Eric Zhang

unread,
Apr 17, 2009, 9:13:40 PM4/17/09
to diy...@googlegroups.com
Actually what you're saving is time when you do gradient PCR.  You end up running multiple PCRs with the same 'master mix.'

ben lipkowitz

unread,
Apr 17, 2009, 10:43:53 PM4/17/09
to DIYbio
On Sat, 18 Apr 2009, ben lipkowitz wrote:
> it adds a nice graphical interface, access from any computer in the lab,
> and reduces time spent configuring communications protocols to zero. it
> also completely removes the requirement for a shitty pushbutton interface.
>
> let's call this add-on interface module number 1

> buttons and switches and shitty LCD screens end up costing more than you


> expect, all said and done. expect a 4x20 character LCD to cost about $15,
> and you need a board to mount the buttons on.
>
> let's call this add-on interface module number 2

(I'm linking to ekitszone simply because it's the first site I found which
had schematics)

using diecimila as the base
http://www.ekitszone.com/index.php?main_page=product_info&cPath=1&products_id=1

and assuming this enc28j60 module for add-on number 1:
http://www.ekitszone.com/index.php?main_page=product_info&cPath=1&products_id=3
which uses PD2 PB2 PB3 PB4 PB5

and this HD44780 module for add-on number 2:
http://www.ekitszone.com/index.php?main_page=product_info&products_id=2
which uses PB0 PB1 PD4 PD5 PD6 PD7

it should be possible to stack them so you can have both at once right?

or use neither and run the arduino solely over USB.

as long as the pins for each module aren't used for something else, you
shouldn't even need a different software configuration. that leaves 9 pins
exposed (all of port C, PD0, PD1, and PD3) for turning on peltier devices
and reading temperature sensors.
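the stacking question boils down to whether the two modules' pin sets overlap. a quick mechanical check using the pin names listed above (python standing in for staring at the schematics; the helper is mine):

```python
ETHERNET = {"PD2", "PB2", "PB3", "PB4", "PB5"}             # enc28j60 module
LCD      = {"PB0", "PB1", "PD4", "PD5", "PD6", "PD7"}      # HD44780 module
SPARE    = {"PC0", "PC1", "PC2", "PC3", "PC4", "PC5",
            "PD0", "PD1", "PD3"}                           # peltiers + sensors

def conflicts(*pin_sets):
    """Return pins claimed by more than one module (empty set = safe to stack)."""
    seen, clash = set(), set()
    for pins in pin_sets:
        clash |= seen & pins
        seen |= pins
    return clash
```

`conflicts(ETHERNET, LCD, SPARE)` comes back empty, so the stack-up works as long as nothing else grabs those pins.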

is this a good plan? am I missing anything obvious? it seems too easy.

-fenn

Josh Perfetto

unread,
Apr 18, 2009, 5:03:26 AM4/18/09
to DIYBio Mailing List
I was just doing some research on peltier devices, and how the thermal blocks on commercial devices work, including gradient PCR.  It looks like in many devices the underside of the metal block where the wells are is only partly covered by peltier devices.  The gradient machines are basically doing strategic placement of 6-8 peltier devices so you can get more than 6-8 different temperatures in the wells, thus conserving peltier devices.

I also saw that you can get 50W 4x4cm peltier plates from bulk sellers on eBay for $6.30.  This seems to be a very common (== cheap) size because people are using them to cool overclocked CPUs.  I’m not yet sure if these devices would work well for PCR (can anyone speak to this?), but here’s an idea if they do.

Suppose we made a design based on having 6 of these plates under the control of the same Arduino board.  If some folks wanted to save costs and only needed a few wells, they could choose to only install 1 of the plates.  If other folks want to optimize PCR reactions, they can have 6 separate temperatures.  That’s not as many as the gradient machines, but it is more flexible in that the temperatures can be totally different.  This would be a simpler design than trying to create temperature gradients over a large heat block.

The downside is that now the device needs a 300W+ power supply, which would be overkill for 1 plate, though I'm not sure if 300W would cost a whole lot more than 50W.  You also have to think about cooling.  I don't know if this would work - thermoelectric engineering really isn't my thing - but could you put a heat sink below each of the installed plates, and then have a sizeable fan moving air over these heat sinks?
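Sizing the supply for the modular idea is simple arithmetic; here's a sketch (the 20% headroom factor is my own assumption, not from any datasheet):

```python
# Power-supply sizing for the modular design: 1 to 6 peltier plates at
# roughly 50 W each. The 20% headroom multiplier is an assumption to
# cover drive losses, not a measured figure.
WATTS_PER_PLATE = 50
HEADROOM = 1.2

for n_plates in range(1, 7):
    supply = n_plates * WATTS_PER_PLATE * HEADROOM
    print(f"{n_plates} plate(s): >= {supply:.0f} W supply")
```

so a fully loaded 6-plate unit wants roughly a 360 W supply, while a single-plate build gets away with 60 W.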

BTW I saw that many commercial thermocycler machines are now shipping with Ethernet.  Guess they beat us to it :(

-Josh

Josh Perfetto

unread,
Apr 18, 2009, 5:24:27 AM4/18/09
to DIYBio Mailing List, ben lipkowitz
On 4/17/09 7:43 PM, "ben lipkowitz" <fe...@sdf.lonestar.org> wrote:

> it should be possible to stack them so you can have both at once right?
>
> or use neither and run the arduino solely over USB.

If you want to support USB-only mode in addition to Ethernet, then you will
have to either run TCP/IP over USB (probably the easiest solution), or
develop a second software stack to control the microcontroller over USB +
controller software for Mac, PC, ugh.

-Josh


Thomas Knight

unread,
Apr 18, 2009, 10:22:54 AM4/18/09
to diy...@googlegroups.com, Thomas Knight
The major problem with thermoelectric coolers is premature failure due
to thermal mismatch. As they heat and cool, the materials of the
coolers expand and contract at different rates, leading to fracture.
I believe there may be issues with off-the-shelf CPU cooling modules
when used in the rapid cycling environment of a thermocycler. If you
look at sites such as Melcor, you will find they spec versions for
cyclic use.

JonathanCline

unread,
Apr 18, 2009, 10:57:41 AM4/18/09
to DIYbio
On Apr 18, 9:22 am, Thomas Knight <t...@csail.mit.edu> wrote:
> The major problem with thermoelectric coolers is premature failure due  
> to thermal mismatch.  As they heat and cool, the materials of the  
> coolers expand and contract at different rates, leading to fracture.  
> I believe there may be issues with off-the-shelf CPU cooling modules  
> when used in the rapid cycling environment of a thermocycler.  If you  
> look at sites such as Melcor, you will find they spec versions for  
> cyclic use.


Heating & cooling is highly dependent on liquid volume. With small
enough volumes, active cooling is not necessary, as long as room
temperature air is circulated.

What is the liquid volume that is desired?

The answer to that question may allow significant improvements to the
mechanical design.


## Jonathan Cline
## jcl...@ieee.org
## Mobile: +1-805-617-0223
########################

Nathan McCorkle

unread,
Apr 18, 2009, 1:05:32 PM4/18/09
to diy...@googlegroups.com
I was personally thinking that for a gradient design, we could have 4 or 6 separate small blocks in the thermocycler housing, each with its own peltier interface(s). That way there would be some insulation between each aluminum block, rather than weird conduction effects happening if you tried having gradients in one larger block.

Cracking the peltier junction sounds awful... but I also would like the PCR process to be fairly quick. If a cycle is 2.5 minutes, I wouldn't want it to take much longer than 3 minutes including temp changes, though I don't know how quick commercial products are at this.

I may be wrong, but with a peltier device, can you change the direction of heating by just changing the direction of the current? I.e., the heating side changes to the cooling side, and vice versa? Maybe we could avoid some stress by having the junctions on, say, the bottom of the block and the top (lid) be the heaters, and have a separate junction that is just for cooling. So each block would have 3 peltier junctions; maybe the cooling junction could be rigged up to some copper tubing filled with antifreeze or a comparable fluid, coiling around its block.

That might not be any better of an idea, but if it could decrease the stress on the peltier junction, I think it would be worth the extra engineering time and investment in parts.
--
Nathan McCorkle
Rochester Institute of Technology
College of Science, Biotechnology/Bioinformatics

Jason Morrison

unread,
Apr 18, 2009, 1:13:08 PM4/18/09
to diy...@googlegroups.com
> I may be wrong, but with a peltier device, can you change the the direction
> of heating by just changing the direction of the current?

Yep - see http://www.fujitaka.com/pub/peltier/english/thermoelectric_cooling.html
or http://www.patentstorm.us/patents/6748747/description.html

--
Jason Morrison
jason.p....@gmail.com
http://jayunit.net
(585) 216-5657

Eric Zhang

unread,
Apr 18, 2009, 5:39:35 PM4/18/09
to diy...@googlegroups.com
Re: desired volume, PCR reactions generally run 10-30 uL per tube.
That's microliters, or 1/1000 of a mL.

Josh Perfetto

unread,
Apr 18, 2009, 6:30:04 PM4/18/09
to DIYBio Mailing List, JonathanCline
A typical volume would be 50 ul. I don't think it's just about liquid
volume though - you also have to heat and cool the metal heat block that
holds the PCR tubes.

-Josh

Josh Perfetto

unread,
Apr 18, 2009, 7:55:01 PM4/18/09
to DIYBio Mailing List
Here are some rough calculations to get this started. The Melcor company
Tom mentioned has a 4cm x 4cm 51.4W peltier device they say is ideal for
"laboratory apparatus". This is CP1.4-127-06L and the 51.4W refers to the
heat absorption on the cooling side of the junction.

Say we put a 4cm x 4cm x 0.6cm piece of aluminum on top of this, in which
the wells were drilled (is 0.6cm enough depth?). Here are some very rough
calculations, ignoring the wells for now (the heat capacity of water is not all
that different from aluminum, the water volume of the reaction may not fill
up the well, and there would be some plastic in there):

Volume = 9.6E-6 m^3
Mass = 9.6E-6 m^3 * 2710 kg/m^3 = 0.026 kg
Heat capacity = 0.026 kg * 900 J/kgK (specific heat @ 20C) = 23.4 J/K
Time to reduce temp 1C = 23.4 J/K / 51.4 J/s = 0.46 s/K

...which seems in the ballpark. In addition to the wells, this calculation
didn't take into account heat loss or thermal conductivity of the aluminum, so
it is really a lower bound - the actual time would be higher. Actually, if
thermal conductivity were the limiting factor, a smaller wattage peltier
device might be better. They also have 33.4W and 72W versions at the same
size.
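For reference, the arithmetic above can be redone for all three wattages in a few lines of Python (same simplifying assumptions: no wells, no heat loss, no conduction limit, so these are lower bounds):

```python
# Lower-bound estimate of block temperature change per degree, following
# the numbers above: a 4cm x 4cm x 0.6cm aluminum block over one peltier.
# Ignores the wells, heat loss, and conduction limits, so real times are
# longer than these.
RHO_AL = 2710.0   # aluminum density, kg/m^3
C_AL = 900.0      # aluminum specific heat near 20C, J/(kg*K)

volume = 0.04 * 0.04 * 0.006        # m^3
mass = volume * RHO_AL              # ~0.026 kg
heat_capacity = mass * C_AL         # ~23.4 J/K

for watts in (33.4, 51.4, 72.0):    # Melcor CP1.4-127 family ratings
    print(f"{watts:5.1f} W -> {heat_capacity / watts:.2f} s per degC")
```

at 51.4 W that works out to about 0.46 s per degree, matching the figure above.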

-Josh


ben lipkowitz

unread,
Apr 19, 2009, 2:56:55 AM4/19/09
to DIYbio
On Sat, 18 Apr 2009, JonathanCline wrote:

> On Apr 18, 9:22 am, Thomas Knight <t...@csail.mit.edu> wrote:
>> The major problem with thermoelectric coolers is premature failure due  
>> to thermal mismatch.  As they heat and cool, the materials of the  
>> coolers expand and contract at different rates, leading to fracture.  
>> I believe there may be issues with off-the-shelf CPU cooling modules  
>> when used in the rapid cycling environment of a thermocycler.  If you  
>> look at sites such as Melcor, you will find they spec versions for  
>> cyclic use.
>
>
> Heating & cooling is highly dependent on liquid volume. With small
> enough volumes, active cooling is not necessary, as long as room
> temperature air is circulated.
>
> What is the liquid volume that is desired?
>
> The answer to that question may allow significant improvements to the
> mechanical design.

I agree with you overall, in that getting rid of the peltier is desirable.
The total heat flux (in watts) is the important parameter. A heatsink with
more surface area will get rid of more heat faster. Running a peltier
simply adds more heat to get rid of.

A heatsink with more surface area will usually also have a higher heat
capacity (making for a slower rise time)

You have some control over the amount of heat loss by turning the fan on
and off.

I am picturing a fat-bottom CPU heatsink like this, with holes drilled in
it for the tubes:
http://www1.istockphoto.com/file_thumbview_approve/2994432/2/istockphoto_2994432_cpu_heatsink.jpg

then, you glue power resistors all around on the sides. if the
temperature is too inconsistent from edge to center you could chop it up
into single rows.

how long would it take for a CPU heatsink with just a fan to cool off from
90 to 50 degrees celsius?

let's assume the heatsink is 8cm wide and has a 1cm thick base plate. add
another 0.5cm thickness for the fins. thus the mass is 8cm*8cm*1.5cm*2.7g/cc
= 259g


from newton's law of cooling:
T(t) = Ta + (T0-Ta)*exp(-r*t)

so how to find 'r'?

well, we know the energy going through the heatsink:
specific heat capacity of aluminum is 0.9J/(g*degC)

259g*(90degC-25degC)*.9J/(g*degC)
= 15.1 kJ goes out of the heatsink if allowed to cool forever

"A heat sink with a 0.5 degree per watt rating is actually a reasonably
good heat sink. That means a CPU that's drawing 70 watts will be running
at 35C above ambient."

so, a typical CPU heatsink has a thermal junction to ambient of about
0.5degC/watt

our mysterious 'r' constant must have units of 1/s

I couldn't find any examples of how to calculate 'r' from real engineering
values, so this is what I came up with by randomly plugging numbers into
my units-aware calculator: (don't try this at work, kids)
(90degC-25degC)/(15.1kJ*0.5degC/watt) = 0.0086092715/s

as I'm led to believe by various math textbook story problems, this value
is significantly less than the cooling capacity of a freshly baked cake,
which doesn't seem quite right, but nevertheless, forging onward:

T(t) = 25degC + (90degC-25degC)*exp(-0.0086092715/s*111s) = 49.99 C

so, according to these calculations which are probably wrong, it will take
about two minutes to cool from 90C to 50C

how long would it take to heat up?
that's an easy calculation:
assuming we can dump 300 watts into the power resistors, and assuming no
heat loss (the fan is off)
(40degC*259g*0.9J/(g*degC))/300W = 31.08 seconds to heat up
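for what it's worth, the mysterious 'r' falls out of the standard lumped-capacitance model: r = 1/(R_th * C_th), thermal resistance times heat capacity. a short sketch under the same assumptions (259g slab, 0.5degC/W sink, 300W heaters) reproduces both numbers:

```python
import math

# Lumped-capacitance check of the heatsink numbers above. The rate
# constant r in Newton's law of cooling is 1/(R_th * C_th), where R_th
# is sink-to-ambient thermal resistance (degC/W) and C_th is heat
# capacity (J/degC). That gives the ~0.0086/s figure directly.
mass_g = 8 * 8 * 1.5 * 2.7      # ~259 g of aluminum
c_th = mass_g * 0.9             # heat capacity, J/degC
r_th = 0.5                      # degC/W, a typical CPU heatsink
r = 1.0 / (r_th * c_th)         # cooling rate constant, 1/s

T_amb, T_hot, T_target = 25.0, 90.0, 50.0
# Invert T(t) = Ta + (T0-Ta)*exp(-r*t) for the time to reach T_target:
t_cool = math.log((T_hot - T_amb) / (T_target - T_amb)) / r
# Heat-up over the same 40 degC span with 300 W in and the fan off:
t_heat = (T_hot - T_target) * c_th / 300.0

print(f"cool 90->50 C: {t_cool:.0f} s")   # ~111 s, about two minutes
print(f"heat 50->90 C: {t_heat:.0f} s")   # ~31 s
```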

this is an average temperature ramp of 1.28deg/s up and 0.36deg/s down,
which kinda sucks compared to the 10degC/s slew rate of an Idaho
Technology RapidCycler2 <http://www.idahotech.com/RapidCycler2/index.html>
which is essentially a light bulb and a fan, and you put your PCR mix in
glass capillary tubes around the light bulb.

random googling shows the typical slew rate of a rt-pcr machine is 10C/s
up and 2.5C/s down, so either they are using way higher wattage for the
heaters or much less aluminum in the block. note that the ratio of heating
to cooling speed is about the same. what is the typical block thickness?
is it a solid slab of aluminum? does anyone have pictures of a
disassembled PCR machine they'd like to share?

-fenn

Cory Tobin

unread,
Apr 19, 2009, 3:46:44 AM4/19/09
to diy...@googlegroups.com
> random googling shows the typical slew rate of a rt-pcr machine is 10C/s
> up and 2.5C/s down, so either they are using way higher wattage for the
> heaters or much less aluminum in the block. note that the ratio of heating
> to cooling speed is about the same. what is the typical block thickness?
> is it a solid slab of aluminum? does anyone have pictures of a
> disassembled PCR machine they'd like to share?


You are right. The thermocyclers that I have used were not solid
blocks of metal with holes drilled in them. They have small aluminum
tubes, 1 for each PCR tube, just wide enough to surround the PCR tube.
It's sort of difficult to explain in words. Maybe you can see what I
am talking about in one of these pictures...

http://bioweb.wku.edu/courses/Biol121/Genetics/thermocycler.png
http://farm2.static.flickr.com/1360/881269960_8cd40dda94.jpg?v=0

If not, I can take a close-up photo of my thermocycler on Monday.


-Cory

ben lipkowitz

unread,
Apr 19, 2009, 6:06:02 AM4/19/09
to diy...@googlegroups.com
On Sun, 19 Apr 2009, Cory Tobin wrote:

> You are right. The thermocyclers that I have used were not solid
> blocks of metal with holes drilled in them. They have small aluminum
> tubes, 1 for each PCR tube, just wide enough to surround the PCR tube.

Now you got me thinking about what exactly the block does. It's supposed
to evenly spread the heat from peltiers across the tubes. But if we're not
using peltiers, then why bother with a slab of aluminum at all?

Why not just suspend the tubes in a wire rack and blow heated air around
inside an insulated chamber? Seems to me that the temperature would get
evenly mixed simply by circulating around the chamber a few times. And air
has a much lower heat capacity than aluminum, so response times would be
near-instant..
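putting textbook numbers on that heat-capacity gap (per unit volume, which is what matters for how fast the chamber responds):

```python
# Volumetric heat capacity: energy needed to warm 1 m^3 by 1 K.
# Air holds roughly 2000x less heat per unit volume than aluminum,
# which is why a circulated-air chamber can change temperature so
# much faster than a metal block. Values are standard textbook
# figures near 20C.
air_density, air_cp = 1.2, 1005.0     # kg/m^3, J/(kg*K)
al_density, al_cp = 2710.0, 900.0     # kg/m^3, J/(kg*K)

air_vol_cap = air_density * air_cp    # ~1.2 kJ/(m^3*K)
al_vol_cap = al_density * al_cp       # ~2.4 MJ/(m^3*K)
print(f"aluminum/air ratio: {al_vol_cap / air_vol_cap:.0f}x")
```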

ben lipkowitz

unread,
Apr 19, 2009, 8:37:46 AM4/19/09
to diy...@googlegroups.com
On Sun, 19 Apr 2009, ben lipkowitz wrote:

> Why not just suspend the tubes in a wire rack and blow heated air around
> inside an insulated chamber? Seems to me that the temperature would get
> evenly mixed simply by circulating around the chamber a few times. And air
> has a much lower heat capacity than aluminum, so response times would be
> near-instant..

responding to my own post, so sad.

couldn't sleep, so I made a sketch of an idea I had for a circular
air-circulating PCR machine:

http://imagebin.org/46068

the yellow thing in the middle is a light bulb; underneath that is a fan.
maybe an axial fan would be better or easier to find, but harder to draw.
yes, it's a small light bulb. less mass to heat up means less wattage.

the whole shebang would be located in an airtight can of some sort, and
the top/bottom would lift up under the control of a solenoid so as to
admit large amounts of cooling air. or perhaps instead of a solenoid there
could be a radial damper like on a barbecue with an R/C servo motor.

one of the mini tubes could have a temperature sensor glued in it.

in addition to being faster, the nice thing about this is you don't have
to make an aluminum block with precise holes in it.

Nathan McCorkle

unread,
Apr 19, 2009, 4:23:56 PM4/19/09
to diy...@googlegroups.com
Well, that last design is basically the $25 PCR machine (google it)... the only thing I don't like is that the light could affect the PCR reaction with some chemicals, possibly (not sure, I guess), but I think it would be better to have a dark reaction chamber.


I am up for a non-block arrangement, but I also don't want this to be made into a coffee can or something like that. I want something that if I pack it in my vehicle, it's not going to rattle apart and fall to pieces from slight vibrations.

I saw a thing on the Discovery channel Friday night where they were making hot tubs. They had their mould and a flat sheet of plastic; they heated the plastic sheet until it started to sag, then placed it over the mould and applied vacuum underneath to create a formed plastic insert. We could do the same thing for the housing of the machine, for a totally custom machine!

JonathanCline

unread,
Apr 19, 2009, 6:01:49 PM4/19/09
to DIYbio
On Apr 19, 5:06 am, ben lipkowitz <f...@sdf.lonestar.org> wrote:
> On Sun, 19 Apr 2009, Cory Tobin wrote:
> > You are right.  The thermocyclers that I have used were not solid
> > blocks of metal with holes drilled in them.  They have small aluminum
> > tubes, 1 for each PCR tube, just wide enough to surround the PCR tube.
>
> Now you got me thinking about what exactly the block does. It's supposed
> to evenly spread the heat from peltiers across the tubes. But if we're not
> using peltiers, then why bother with a slab of aluminum at all?
>
> Why not just suspend the tubes in a wire rack and blow heated air around
> inside an insulated chamber?


Or use a capillary: draw the solution into the capillary, and move the
solution between heated capillary sections ("temperature regions")
suspended in some substrate with higher thermal conductivity than air,
such as mineral oil.


The important design consideration here is that a new thermocycler is
not a task of "reverse engineer old or current equipment and replicate
it cheaply". It is re-engineering the needed features, perhaps in
a new way.

Tito Jankowski

unread,
Apr 19, 2009, 6:10:18 PM4/19/09
to diy...@googlegroups.com
For instance, what if we wanted to make a "species identifier" device.
Take a sample of a leaf and tell what species of plant it is - this
would involve gel electrophoresis and PCR, as well as a camera to
analyze the gel. What would be the spec for that thermal cycler?

DIYbio is about complementing and extending academic uses of bio.

Sent from my iPhone

Josh Perfetto

unread,
Apr 19, 2009, 6:28:42 PM4/19/09
to DIYBio Mailing List
I’m up for non-peltier block designs if the outcome is a better machine than commercial peltier thermocyclers, but otherwise my goal is a quality machine for as cheap as possible, not just something better than water baths for as cheap as possible (I think as Cory said there may be multiple visions of this).

I wouldn’t use the $25 thermocycler myself because:

  1. With cooling rates of 0.25C/s, it is like 8 times slower than commercial devices
  2. It seems there will be much more temperature variability between the tubes.  It works by putting a temperature sensor in one tube, and regulating the chamber temperature based on that tube.  If the volume of all the tubes was exactly the same, this might work, but slight differences in volume will cause differences in the time each tube spends at each temp, since different tubes will arrive at that temp at different times.  This is something the aluminum block avoids, as it's a much better conductor of heat than air.

I don't mean to discourage development of solutions like this since it may meet the needs of many people and is certainly a welcome step up if you are doing water baths (you can also get the design and download the control source code for this here http://www3.interscience.wiley.com/cgi-bin/fulltext/113449444/HTMLSTART rather than re-creating it from scratch).  But what I would really like to see (and work on myself) is a thermocycler that is cheaper AND better than commercial devices.  What I really like about the open gel box that Tito & Norman worked on is that even though I already have a commercial gel box, I might prefer to use the open gel box, because it can do things my commercial gel box can't (visualize bands while the gel is running).

-Josh

JonathanCline

unread,
Apr 19, 2009, 6:32:33 PM4/19/09
to DIYbio
On Apr 19, 5:10 pm, Tito Jankowski <titojankow...@gmail.com> wrote:
> For instance, what if we wanted to make a "species identifier" device.
> Take a sample of a leaf and tell what species of plant it is - this  
> would involve gel electrophoresis and PCR, as well as a camera to  
> analyze the gel. What would be the spec for that thermal cycler?
>
> DIYbio is about complementing and extending academic uses of bio.


I would guess that involves multiple-gradient PCRs (since the
protocol is likely unknown)? Previous discussion explained gradient
PCR as varying a single variable of the entire system. It is
certainly possible to scale this to N variables, if each microreactor
is independent. Which is doable if the microreactor is under precise
control. I wrote up this idea last month though it doesn't seem to be
that useful in the real world yet (since your "take a sample, run PCR,
run to a gel, take a picture" machine doesn't exist yet either).

The idea goes like this: Insert single sample into machine. It is
automatically distributed to N microreactors. Each microreactor
undergoes a multi-variable gradient PCR (varying several parameters at
once, across a min/max resolution: annealing temperature, annealing
time, etc). Each microreactor is monitored for PCR completion
(variation on RT PCR). The user discards the output of all
microreactors except the optimally-PCR'ed microreactor, which has
shown the best product. Microfluidics may provide this kind of
scalability which allows N microreactors to operate/sense
independently.


I heard an interesting application for such a device like you
describe, though, when it eventually gets built. It seems that
physical anthropologists could use such a device for all the bones and
such they dig up. They aren't biologists -- so they may not know
anything about wet lab -- yet they would benefit greatly from
biological tools to provide precise answers to the problems in their
field.

Josh Perfetto

unread,
Apr 19, 2009, 8:16:16 PM4/19/09
to DIYBio Mailing List, JonathanCline
On 4/19/09 3:32 PM, "JonathanCline" <jnc...@gmail.com> wrote:

> The idea goes like this: Insert single sample into machine. It is
> automatically distributed to N microreactors. Each microreactor
> undergoes a multi-variable gradient PCR (varying several parameters at
> once, across a min/max resolution: annealing temperature, annealing
> time, etc). Each microreactor is monitored for PCR completion
> (variation on RT PCR). The user discards the output of all
> microreactors except the optimally-PCR'ed microreactor, which has
> shown the best product. Microfluidics may provide this kind of
> scalability which allows N microreactors to operate/sense
> independently.


This would be cool in that you could have a self-optimizing PCR machine.
There are probably many more uses as well not yet imagined. For PCR
optimization I don't know how you'd do this with RT-PCR alone (since you are
trying to optimize for a specific product, not just maximal amount of any
product), but you could do this if you integrate a gel run into your system.
You also do not need to be limited to doing the optimization in a single run
(which would waste valuable reagents); you could do a series of runs to
use as efficient an optimization strategy as possible.

I think that we could work towards something like this in stages. You could
start with:

1. A networked PCR device (possibly RT-PCR) capable of independently
controlling some groups of wells.
2. A networked microfluidics device that can fill plastic well plates/strips
with different mixtures of fluids
3. A networked gel electrophoresis reader (Tito and I were talking about
possible modifications of Keiki gels to do this)

You can then have these devices all under the control of a single web
application, which can be managing the PCR optimization strategy (doing the
first run, looking at the results, determining what next to change to do the
multivariate optimization, etc.), or managing the process for whatever else
the goal is. This shows the power of networked devices, and makes it easier
to accomplish common tasks.
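To make the idea concrete, here is a bare-bones sketch of what that single controlling application might look like, with stub classes standing in for the networked thermocycler, liquid handler, and gel reader. Every class, method, and the scoring logic here is invented for illustration; nothing is a real protocol or API.

```python
# Skeleton of the single-web-application idea: one controller drives
# three networked instruments through one optimization pass. All the
# device APIs below are hypothetical stand-ins.

class LiquidHandler:
    def prepare_plate(self, mixtures):
        print(f"dispensing {len(mixtures)} reaction mixes")

class Thermocycler:
    def run(self, programs):
        print(f"cycling {len(programs)} well groups independently")

class GelReader:
    def score_bands(self, n_lanes):
        # Stub: pretend the middle lane gave the cleanest product.
        return [1.0 if i == n_lanes // 2 else 0.3 for i in range(n_lanes)]

def optimize_annealing(temps):
    """One round of gradient optimization over the candidate temps."""
    handler, cycler, reader = LiquidHandler(), Thermocycler(), GelReader()
    handler.prepare_plate(["mix"] * len(temps))
    cycler.run([{"anneal_c": t} for t in temps])
    scores = reader.score_bands(len(temps))
    best = max(range(len(temps)), key=scores.__getitem__)
    return temps[best]

print(optimize_annealing([50, 55, 60, 65, 70]))  # prints 60 (stub scoring)
```

A real version would loop, feeding each round's gel scores back into the choice of the next round's temperatures, which is the multivariate strategy described above.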

You could start off by manually shuttling plates between these devices, and
later automate this either with robotics, or with an integrated device using
microfluidics to transport fluids between the components. This could be
either custom devices (like an automated species identifier), or a general
purpose device. Of course, the other approach to this is lab on a chip, and
there are some papers on doing PCR on microfluidics chips. You could also
look at something like a switched microfluidics LAN to transport fluids
between devices. Obviously this robotics or microfluidics integration is
quite advanced, which is why I think we should start with simpler,
independent, networked devices, and go forward from there.

-Josh


Bryan Bishop

unread,
Apr 19, 2009, 8:46:18 PM4/19/09
to diy...@googlegroups.com, kan...@gmail.com
On Sun, Apr 19, 2009 at 7:16 PM, Josh Perfetto <jo...@snowrise.com> wrote:
> On 4/19/09 3:32 PM, "JonathanCline" <jnc...@gmail.com> wrote:
>> The idea goes like this:   Insert single sample into machine.  It is
>> automatically distributed to N microreactors.  Each microreactor
>> undergoes a multi-variable gradient PCR (varying several parameters at
>> once, across a min/max resolution: annealing temperature, annealing
>> time, etc).  Each microreactor is monitored for PCR completion
>> (variation on RT PCR).    The user discards the output of all
>> microreactors except the optimally-PCR'ed microreactor, which has
>> shown the best product.   Microfluidics may provide this kind of
>> scalability which allows N microreactors to operate/sense
>> independently.
>
> I think that we could work towards something this in stages.  You could
> start with:
>
> 1. A networked PCR device (possibly RT-PCR) capable of independently
> controlling some groups of wells.
> 2. A networked microfluidics device that can fill plastic well plates/strips
> with different mixtures of fluids
> 3. A networked gel electrophoresis reader (Tito and I were talking about
> possible modifications of Keiki gels to do this)
>
> You can then have these devices all under the control of a single web
> application, which can be managing the PCR optimization strategy (doing the
> first run, looking at the results, determining what next to change to do the
> multivariate optimization, etc.), or managing the process for whatever else
> the goal is.  This shows the power of networked devices, and makes it easier
> to accomplish common tasks.
>
> You could start off by manually shuttling plates between these devices, and
> later automate this either with robotics, or with an integrated device using
> microfluidics to transport fluids between the components.  This could be
> either custom devices (like an automated species identifier), or a general
> purpose device.  Of course, the other approach to this is lab on a chip, and
> there are some papers on doing PCR on microfluidics chips.  You could also
> look at something like a switched microfluidics LAN to transport fluids
> between devices.  Obviously this robotics or microfluidics integration is
> quite advanced, which is why I think we should start with simpler,
> independent, networked devices, and go forward from there.

So, it's good that we're thinking about the same things here. By the
way- there have been examples in the literature of microfluidic
devices that implement something equivalent to gel electrophoresis,
and sometimes even mass spectrometry, without the use of gels.

I like the idea of integrating it all on a lab on a chip (eventually -
separate components need to be built one at a time, and such), but the one
problem I see with it is that you can't grab resultants after
individual reactions for storage or debugging. Some sort of networking
into waiting containers would be a nice fix, although not if it
substantially subtracts from analyte mass, which would be a detriment
to the experiment, etc.

What do you mean by a switched microfluidics LAN system for fluid
transport? Do you mean tubes going all over the place between devices,
i.e., larger than lab on a chip? Where'd this idea come from? :-)

The software that I'm working on is specifically aiming towards the
automatic generation of instructions for moving stuff from one machine
to the next for different protocols-- either by hand, or in an
integrated device (by hand comes first though). These instructions are
just really really elaborate and nice protocols without the pain of
writing detailed protocols. In a sense, it's kind of like CAM.

ben lipkowitz

unread,
Apr 19, 2009, 9:27:03 PM4/19/09
to DIYBio Mailing List
On Sun, 19 Apr 2009, Josh Perfetto wrote:

> I wouldn't use the $25 thermocycler myself because:
>
> 1. With cooling rates of 0.25C/s, it is like 8 times slower than commercial
> devices

The "$25 thermocycler" didn't include a fan. It was just a light bulb. I
think it was open-air so they could have convection going through, which
would reduce the heating rate too. What I'm suggesting is to open a hole
which lets cool air through. I think this will heat and cool faster than
any reasonably DIY aluminum block design. I will do some experimentation
if I manage to get away from the computer for a while.

> 2. It seems there will be much more temperature variability between the
> tubes. It works by putting a temperature sensor in one tube, and regulating
> the chamber temperature based on that tube. If the volume of all the tubes
> was exactly the same, this might work, but now slight differences in volume
> will cause differences in time each tube will spend at each temp since
> different tubes will arrive at that temp at different times. This is
> something the aluminum block avoids as it's a much better conductor of heat
> than air.

Why would you be running multiple reactions with different volumes? Do you
actually do this on a regular basis?

I think the small thermal mass of the tube means that it will spend much
more time at the desired temperature than when changing temperatures.

The other way to go about it is to use a programmable limited temperature
ramp rate, which allows all the tubes time to equalize during temperature
changes. It seems to me that this would increase nonspecific binding
though.

> I don't mean to discourage development of solutions like this since it
> may meet the needs of many people and is certainly a welcome step up if
> you are doing water baths

I came up with this design because of the limitations of homemade aluminum
blocks, not as a 'bare minimum' design. It looks like the commercial
blocks are intricately machined in order to reduce thermal mass. I don't
see how we could do this without a CNC milling machine or excessive cost
to get it made professionally.

> (You can also get the design, and download the control source code for
> this here
> http://www3.interscience.wiley.com/cgi-bin/fulltext/113449444/HTMLSTART
> rather than re-creating this from scratch).

where is the code? (not like it matters, I wouldn't use a PC for this anyway)

not to be a jerk, but the design documentation is really bad. it would be
hard to exactly replicate what they did since there is so much left
unspecified, and a fair amount of re(verse)-engineering would be required.

-fenn

Josh Perfetto

unread,
Apr 19, 2009, 10:06:39 PM4/19/09
to DIYBio Mailing List, Bryan Bishop
On 4/19/09 5:46 PM, "Bryan Bishop" <kan...@gmail.com> wrote:
> So, it's good that we're thinking about the same things here. By the
> way- there have been examples in the literature of microfluidic
> devices that implement something equivalent to gel electrophoresis,
> and sometimes even mass spectrometry, without the use of gels.

Yeah I need to look at this some more, there would be a lot of advantages
with automated gel runs or their equivalent.

> What do you mean by a switched microfluidics LAN system for fluid
> transport? Do you mean tubes going all over the place between devices,
> i.e., larger than lab on a chip? Where'd this idea come from? :-)

I was initially thinking about the question Tito asked about how to create a
species identification device, and then about lab automation in general.
One approach is lab on a chip, and if you actually do have a chip that has
all that you need, this would be the better approach. If you can't do it on
a single chip though, and you need to use two different devices (or a
microfluidics chip and other external device, or 2 microfluidics chips), you
might be able to connect these devices with capillary tubes and a control
protocol. Under control of the same computer, device A sends an
identifiable spacer liquid followed by the liquid of interest followed by
another spacer liquid over a capillary tube, and device B identifies the
liquid of interest between the two spacers, and redirects it as appropriate
when it comes in.
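As a toy illustration of that framing scheme (everything here is hypothetical; no such protocol exists), device B's job reduces to pulling out whatever arrives between the two spacers, with the sensor stream modeled as a list of identified liquid "slugs":

```python
# Toy sketch of the spacer-liquid framing described above: device A
# sends SPACER, payload, SPACER down a capillary; device B watches its
# sensor stream and keeps whatever lies between the two spacers. All
# names here are hypothetical stand-ins.
SPACER = "S"   # an identifiable spacer liquid (e.g. a dyed, immiscible plug)

def extract_payload(stream):
    """Return the slugs of liquid seen between the first two spacers."""
    payload, inside = [], False
    for slug in stream:
        if slug == SPACER:
            if inside:
                break          # second spacer: payload complete
            inside = True      # first spacer: start collecting
        elif inside:
            payload.append(slug)
    return payload

# Device B's sensor sees carrier fluid, then the framed sample:
print(extract_payload(["carrier", "S", "pcr_product", "S", "carrier"]))
# -> ['pcr_product']
```

In practice the "identification" step is the hard part; the spacer would need an optically or electrically distinctive signature for the receiving device to detect.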

> The software that I'm working on is specifically aiming towards the
> automatic generation of instructions for moving stuff from one machine
> to the next for different protocols-- either by hand, or in an
> integrated device (by hand comes first though). These instructions are
> just really really elaborate and nice protocols without the pain of
> writing detailed protocols. In a sense, it's kind of like CAM.

I was thinking about the software you mentioned when I wrote that. I think
the wise progression is:

1. Such software tells the biologist what to do
2. Such software tells the networked devices what to do (e.g. the thermocycler)
and tells the biologist when to move the materials
3. Such software directs devices (either on a single chip or integrated in
some manner) to control the whole process.
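One way to make stages 1 and 2 of that progression share a single protocol description: each step names a device and a note, and a runner either prints the step for the biologist (no driver registered) or hands it to a networked device driver. Every name below is illustrative, not from any existing package.

```python
# Stage 1: run(protocol) prints operator instructions.
# Stage 2: run(protocol, {"thermocycler": driver_fn}) drives the device
# for thermocycler steps and still prompts the biologist for the rest.

def run(protocol, drivers=None):
    drivers = drivers or {}
    log = []
    for step in protocol:
        driver = drivers.get(step.get("device"))
        if driver is None:
            # no driver registered: tell the biologist what to do
            log.append(f"OPERATOR: {step['note']}")
        else:
            # a networked device executes the step directly
            driver(step)
            log.append(f"DEVICE {step['device']}: {step['note']}")
    return log

PCR = [
    {"device": "thermocycler", "note": "94C 3 min initial denature"},
    {"device": "thermocycler", "note": "30x [94C 30s / 55C 30s / 72C 30s]"},
    {"device": None, "note": "move tubes to the gel box"},
]
```

Stage 3 then becomes "register a driver for every device," with no change to the protocol format itself.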

-Josh


Bryan Bishop

unread,
Apr 19, 2009, 10:10:50 PM4/19/09
to Josh Perfetto, diy...@googlegroups.com, kan...@gmail.com

Yes, I agree with all of that. But here's an oddball, what about the
ability to print out a custom lab on a chip from an inkjet or laser
scanner printer? These chip designs can be dynamically generated to
implement some given protocol, and then you have your entire
(specific) experiment printed on a mask (or laminated and such- there
are many methods to accomplish this). Where would this fit in? I
suppose it's kind of like "directing the entire lab on what to do,
except all of the instructions are pre-written in cache buffers since
it's so simple to print off a new version."

Josh Perfetto

unread,
Apr 19, 2009, 10:45:08 PM4/19/09
to DIYBio Mailing List, ben lipkowitz
On 4/19/09 6:27 PM, "ben lipkowitz" <fe...@sdf.lonestar.org> wrote:
> The "$25 thermocycler" didn't include a fan. It was just a light bulb. I
> think it was open-air so they could have convection going through, which
> would reduce the heating rate too. What I'm suggesting is to open a hole
> which lets cool air through. I think this will heat and cool faster than
> any reasonable DIY aluminum block design. I will do some experimentation
> if I manage to get away from the computer for a while.
...
> Why would you be running multiple reactions with different volumes? Do you
> actually do this on a regular basis?
>
> I think the small thermal mass of the tube means that it will spend much
> more time at the desired temperature than when changing temperatures.

I'm sure you could come up with a much better design than the $25
thermocycler. You could prove me wrong on this, I just intuitively worry
that such a device will not have very precise control of the tube
temperature. If you introduce the fan now you have to worry about
turbulence patterns and convection currents, which of course depend on the
exact shape of the chamber. I don't usually deliberately run different
volumes, it's just an example of another sensitivity of this approach.

> I came up with this design because of the limitations of homemade aluminum
> blocks, not as a 'bare minimum' design. It looks like the commercial
> blocks are intricately machined in order to reduce thermal mass. I don't
> see how we could do this without a CNC milling machine or excessive cost
> to get it made professionally.

Ah, now I see where you are coming from and why you were so concerned about
the shape of the wells. I was thinking that the components of the open
thermocycler would be either off-the-shelf or sourced from a professional
machine shop, and then perhaps sold in kits like the open gel box to be
assembled at home. If it is to be constructed in an entirely homemade
manner, I can see how the well shape would be problematic.

-Josh


Cory Tobin

unread,
Apr 20, 2009, 12:21:30 AM4/20/09
to diy...@googlegroups.com
>> I came up with this design because of the limitations of homemade aluminum
>> blocks, not as a 'bare minimum' design. It looks like the commercial
>> blocks are intricately machined in order to reduce thermal mass. I don't
>> see how we could do this without a CNC milling machine or excessive cost
>> to get it made professionally.

Yeah, the commercial machines definitely have been intricately
machined. Here are some close-up pictures of my thermocycler's
block...
http://www.its.caltech.edu/~ctobin/DSCN1108.JPG
http://www.its.caltech.edu/~ctobin/DSCN1109.JPG
http://www.its.caltech.edu/~ctobin/DSCN1110.JPG

In these pictures it looks like they have drilled other holes in the
metal (besides the holes for the tubes) to decrease the mass of the
block. Does drilling holes into aluminum require something more than
a standard power drill from Home Depot? (I don't know the first thing
about metal work)

Another thing to consider is that the standard PCR tube is tapered at
the bottom. So drilling a cylindrical hole into a piece of aluminum
won't fit the tube snugly.


-Cory

Aaron Hicks

unread,
Apr 20, 2009, 1:09:31 AM4/20/09
to diy...@googlegroups.com

Many types of aluminum are exceptionally soft- particularly free-machining varieties like 6061-T6 (one of my favorites, although pretty much all of the 6000-series aluminums are pleasant to work with). It's readily worked, and machining equipment like lathes and mills have recently become quite affordable. Even the CAM stuff has dropped to the consumer-hobbyist level, such as the TAIG models:

http://www.taigtools.com/

Don't be fooled by the "mini" designation; they can handle a piece of work up to 5.5" x 12". A CNC mill for under $2500:

http://www.taigtools.com/cmill.html

As for drilling the holes- you'd probably be best off having someone make a custom bit that is exactly the size/shape of the outside of a PCR tube, and use that for drilling holes. Pricey, but do-able; Google up "custom drill bits."

Or just have someone mill a block of plastic using some CAM software (eMachineShop or whatever), and use that for sand casting your block from aluminum using sand + oil. Once you have a positive mold from which to work, you can make pretty much as many castings as you want off of it.

Lastly, you could have one milled out of aluminum, and then use a fairly soft Shore durometer silicone to make a negative mold, and then pull a positive mold off of that using a slightly harder silicone, and use THAT for your casting. I buy the silicone mold stuff off of eBay. It's surprisingly easy to work with.

-AJ

ben lipkowitz

unread,
Apr 20, 2009, 1:12:57 AM4/20/09
to diy...@googlegroups.com
On Sun, 19 Apr 2009, Cory Tobin wrote:

> In these pictures it looks like they have drilled other holes in the
> metal (besides the holes for the tubes) to decrease the mass of the
> block. Does drilling holes into aluminum require something more than
> a standard power drill from Home Depot? (I don't know the first thing
> about metal work)

for drilling this many holes, and locating and sizing them accurately, you
would need a drill press. I'm more concerned with how to cut out the
little tube shape sticking up.. it can't be just a tube inserted into a
hole as that would have poor heat transfer and defeat the purpose of using
a solid block in the first place. maybe "alumalloy" aluminum solder?

> Another thing to consider is that the standard PCR tube is tapered at
> the bottom. So drilling a cylindrical hole into a piece of aluminum
> won't fit the tube snugly.

One can easily grind the shape of a drill bit with a bench grinder to do
weird tapers like this.

Josh Perfetto

unread,
Apr 20, 2009, 1:46:08 AM4/20/09
to DIYBio Mailing List, Cory Tobin
Wow those plates are quite advanced; the thermocyclers I've used didn't
have those mass-decreasing areas cut out. The design in the images you sent
probably enables even faster cycles, but may not be required if it's too
difficult to make.

This is just a random thermocycler I found on the web, but you can see that
the plate design of this one doesn't have that:
http://www.labrepco.com/therm_stand_t1.php

Better pictures on first page of this PDF:
http://www.labrepco.com/docs/T1-Flyer.pdf

Or maybe we could compromise and drill a single mass-decreasing cylindrical
hole in between each square of 4 tapered well holes - it won't remove as much
mass as yours does, but may be cheaper.

-Josh

Josh Perfetto

unread,
Apr 20, 2009, 2:51:36 AM4/20/09
to DIYBio Mailing List, Aaron Hicks
I was just trying to find out the exact shape of a PCR tube. I eventually found the "PCR sample tube" patent 6015534 ( http://www.patentstorm.us/patents/6015534.html ), which says the tapered portion of the tube (the portion that holds the sample and must make contact with the thermocycler) is an approximately 17 degree cone. The sample tube cone does not extend to a point, but eventually rounds off. The thermocycler well cone extends a little longer before also rounding off (in the example given, there is another cone with a bigger angle at the bottom of the well, so it extends to a point much sooner). This leaves a small air cavity at the bottom of the well when the tube is placed in, and I am guessing that the bottoms of PCR tubes from different manufacturers are not standardized as much as this 17 degree cone, so some excess cavity is probably required to make the different tubes fit. Figure 15 shows a really nice picture of this.

I don't know anything about metal working, but I wonder if you might be able to find a 17 degree cone drill bit already made.
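As a rough sanity check on that geometry, here is the well depth implied by the cone, treating the patent's 17 degrees as the full included angle. The tube's top diameter is an assumed illustrative value, not a figure from the patent.

```python
import math

# A cone with full included angle 2a and top radius r runs r / tan(a)
# deep before it would reach a point (the real tube rounds off earlier).
cone_angle_deg = 17.0    # full included angle of the tapered section
top_diameter_mm = 6.0    # assumed diameter where the taper begins

half_angle = math.radians(cone_angle_deg / 2.0)
depth_mm = (top_diameter_mm / 2.0) / math.tan(half_angle)
print(f"conical well depth to a point: {depth_mm:.1f} mm")
```

For these assumed numbers the taper is about 20 mm long, which shows why a shallow countersink won't do and the drilled well really does need the matching taper for good contact.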

-Josh

Jake

unread,
Apr 20, 2009, 4:19:50 AM4/20/09
to DIYbio
Wow, lots of posts. My thermocycler doesn't have a fancy block. It's
just a standard block of aluminum with holes drilled in it. They're
rounded and tapered like an eppendorf tube.

One thing I was thinking about was block material. What about using
copper? Aluminum is kind of hard to cast. And casting might be a lot
cheaper than milling. You could also do a fancy block design that
way.

Mine doesn't have a fancy block, so I know it's not required.

I'll tear mine apart so I can take some pictures, find out how much
wattage is going to the peltier junction, and hopefully more info
about how it's controlling the block.

Maybe someone with a heated lid can take some measurements and see
what the lid's doing in comparison to the block?


Meridith, it seems like you're giving up on having this device run in
stand-alone mode. I think that's an important feature. I never
thought of the computer as doing any more than loading programs into
the device. Of course there could be a monitor mode and it would be
nice to have all the data from sensors for debugging and calibrating
the device.

Anyone can scrounge up some push buttons and LED lights, so I think it
would be good to try and maintain some sort of interface on the device
itself. I'm not talking a display, just a few lights, and some
buttons.

About the ethernet... Is it going to cost $20 extra over a
non-ethernet system, or is that just the cost of the parts for the
ethernet-enabled board?

I was hoping this thing could be built for around $200.


Ah ha!
http://www.labrepco.com/therm_stand_t1.php

"Note: In general, the lid temperature should be about 10°C above the
highest temperature in the protocol. Example: If your denaturation
step is at 95°C, enter 105°C for the lid."

That solves the heated lid question.

The link above has an interesting thermocycler and some details. I
think we should be shooting for something like this. It has a serial
port and a printer port. It also comes with some windows software to
control the device.

There's also this one:
http://www.labrepco.com/therm_personal.php

It's a little smaller. The site mentions that their blocks are
interchangeable! That means that maybe we could just order a
combiblock from them for use in an openthermocycler. They even have
gold-plated silver blocks. Pretty cool.


-Jake

Jake

unread,
Apr 20, 2009, 4:33:22 AM4/20/09
to DIYbio
I found this... http://www.bestlabdeals.com/ProductDetails.asp?ProductCode=32321&click=35

It's a thermocycler block for $274.60. It looks like it includes the
block, peltier junction, and heat sink, but no fan.

Too bad they stick it to you on every little part. Even after
spending a couple thousand on their machine they can't even make
reasonably priced replacement parts for it. Sad. Shows we really
need an open thermocycler.

Jake

unread,
Apr 20, 2009, 4:47:46 AM4/20/09
to DIYbio
Maybe one of you CAD experts can check out this site or another one
like it...
http://www.emachineshop.com/

One of the examples looked kind of like what we might need (just for a
price idea) and was $13.98 each in quantities of 50. It was under
the sample prices page.

Will someone design something and come up with some price ideas for
different sized blocks?

Josh Perfetto

unread,
Apr 20, 2009, 5:09:47 AM4/20/09
to DIYbio
On Apr 20, 1:19 am, Jake <jakes...@mail.com> wrote:
> One thing I was thinking about was block material.  What about using
> copper?  Aluminum is kind of hard to cast.  And casting might be a lot
> cheaper than milling.  You could also do a fancy block design that
> way.

The thermal properties aren't quite as good:

Aluminum: 900 J/kgK Specific Heat * 2710 kg/m^3 Density = 2.439E6 J/m^3K
Copper: 385 J/kgK Specific Heat * 8960 kg/m^3 Density = 3.450E6 J/m^3K

So copper would store more heat, making the cycle time slower, unless
a fancy block design compensates for this. There is a nice chart of
metals on page 12 of this PDF:
http://www.melcor.com/pdf/Thermoelectric%20Handbook.pdf
I think you want high thermal conductivity and low density * specific
heat. Silver looks like it might be the best. Solid silver,
anyone? :)
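The comparison above, extended to silver, as a quick calculation. Volumetric heat capacity (specific heat * density) sets how much energy the Peltier must pump per degree of block temperature change; thermal conductivity sets how quickly and evenly the wells track the Peltier face. The property values are textbook room-temperature approximations, matching the aluminum and copper figures quoted above.

```python
# Lower volumetric heat capacity -> faster ramps per watt of Peltier power.
materials = {
    # name: (specific heat J/(kg*K), density kg/m^3, conductivity W/(m*K))
    "aluminum": (900.0, 2710.0, 237.0),
    "copper":   (385.0, 8960.0, 401.0),
    "silver":   (235.0, 10490.0, 429.0),
}

for name, (c, rho, k) in materials.items():
    vol_heat = c * rho  # J/(m^3*K)
    print(f"{name:8s} {vol_heat / 1e6:6.3f} MJ/(m^3*K)   k = {k:.0f} W/(m*K)")
```

Silver comes out at roughly 2.47 MJ/(m^3*K), essentially tied with aluminum for heat storage but with much higher conductivity, which is why it looks best on paper (cost aside).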

-Josh

Thomas Knight

unread,
Apr 20, 2009, 7:55:54 AM4/20/09
to diy...@googlegroups.com, Thomas Knight
Just a thought -- if you make the block out of gallium (or Wood's
metal) it will melt at PCR temperatures, and form an effective way to
transfer heat to the tubes. Or you could fill a hole in another
material with gallium.
