Fusion Message Board

In this space, visitors are invited to post any comments, questions, or skeptical observations about Philo T. Farnsworth's contributions to the field of Nuclear Fusion research.

Subject: Tritium production and measurement
Date: Jul 19, 4:30 pm
Poster: Richard Hull

I have posted on this before, but did some more calculations for a realistic requirement for tritium decay count detection in a backfilled fusor chamber following a run (backfilled with argon to about 25 mm or so to make the system its own internal Geiger counter).

First let us assume some givens and discuss the technique.

Given control test:

Chamber is exhausted to ~10^-4 torr.
Chamber is at room temperature.
Fill to 1 micron with D2.
Fill further to 25 mm or so with argon.
Connect fusor as a GM detector tube.
Take a long-term count.
Record data.

Technique:

Chamber is exhausted following the count run to 10^-4 torr again.
Fill to 1 micron with D2 and valve off at run time.
Run fusor for fusion and take neutron count data.
At end of run, allow system to reach room temperature.
Back fill to 25 mm with argon and connect as GM counter.
Take data over the same run time as in the first count test.
Compare results.
Compute tritium content.



This is a well-defined procedure, but will it work?



Let's do some thinking. In our thinking we will round off for grins and giggles.

Any quantity of tritium will require about 3-5 half-lives to be mostly disposed of. Let us assume 4 half-lives, rounded down to about 40 years (tritium's half-life is about 12.3 years). This works out to about 21 million minutes. To warrant 1 count per minute from tritium in our fusor we would need about 21 million atoms of tritium. Hopefully we, as amateurs using good technique, could reach statistical significance on this in our differential measurement above.

Note: I realize that tritium decay is exponential, but I choose worst-case linear logic to allow for an ever-rising confidence in the results. This way we get a nice surprise after the run and good statistical significance with high confidence.
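The round numbers above can be checked in a couple of lines. This is a sketch using the post's own worst-case linear assumption and its rounded 40-year figure for ~4 half-lives:

```python
# Worst-case linear tritium decay estimate, following the post's
# round figure of 40 years for ~4 half-lives (tritium's actual
# half-life is ~12.3 y, so 4 half-lives is closer to 49 y).

YEARS = 40
minutes = YEARS * 365.25 * 24 * 60   # ~21 million minutes

# Linear logic: N atoms spread evenly over that span give N/minutes
# decays per minute, so 1 count/min needs about `minutes` atoms.
atoms_per_cpm = minutes
print(f"{minutes:.3g} minutes -> ~{atoms_per_cpm:.3g} T atoms for 1 cpm")
```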

If the above is near the actual case, and we achieve a uniform and continuous rate of 10^5 n/s, that also means about 10^5 tritium atoms/second (the two D-D branches occur in roughly equal proportion). We would have to run the fusor for 210 seconds, or 3.5 minutes.
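That run-time figure follows directly, assuming (as the post does) one tritium atom per detected neutron rate:

```python
# Run time needed to accumulate ~21 million tritium atoms at a
# steady 1e5 n/s (assumption: tritium production rate equals the
# neutron rate, since the two D-D branches are roughly 50/50).
atoms_needed = 21e6
rate = 1e5                      # tritium atoms per second
seconds = atoms_needed / rate
print(seconds, seconds / 60)    # 210.0 s, 3.5 min
```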

Realistically, such a rate, while doable, is not sustainable in a simple fusor. Based on my experience, a well-nursed 30-minute run would be needed. I have had moments in a timed run when things were going wrong (too much gas, too low voltage or current) and fusion rates have fallen precipitously. About the longest and most grueling run I have ever had might be on the order of 30 minutes. This might allow a statistically significant tritium level to be measured via decay in GM mode.

If you make the run, you would have to let the system cool down to ambient before backfilling with argon for the tritium count. You don't want hot excited gas atoms giving false or boosted count rates.

I hope to arrange fusor IV's gas system to do just this sort of test. Such a low-budget decay run test might be about as sensitive to tritium as an expensive RGA sniffing away in the gas.

With a good and expensive modern RGA and its electron multiplier option, detection of 1 part in 10^11 is possible. The gas load in a fusor might be 10^17 gas atoms. It would be close. Only the RGA would crud up, and the electron multiplier would be useless at 1 micron. A separate vacuum chamber would have to be taken to 10^-11 torr and a sample of fusor gas bled in to even run the RGA test.
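Putting the two numbers side by side shows just how close it is. This sketch uses the tritium yield worked out above and the post's estimates for the gas load and RGA sensitivity:

```python
# Is ~21 million tritium atoms detectable against the RGA's limit?
tritium_atoms = 21e6
gas_load = 1e17                 # post's estimate of total gas atoms in the fusor
fraction = tritium_atoms / gas_load   # tritium fraction of the gas

rga_limit = 1e-11               # quoted sensitivity: 1 part in 10^11
print(fraction, fraction > rga_limit)   # 2.1e-10, True -- just above the limit
```

So the tritium fraction sits only about a factor of 20 above the quoted detection limit, which is why "it would be close."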

So the tritium radiation test looks good for a good, productive fusor run, if the data is carefully taken and the operator is forced to work on a shoestring budget.

Richard Hull