
TI-74 stats question...

JGardner
Experienced Member
Joined: Jun 14, 2009
Messages: 200
Hi - In 1986 I bought a TI-74, and I'm still not done with it...

I'm working on a program to extract various statistical indices from
a table of data.

Data occurs in the range (40 < n < 450). Data outside
this range is not accepted by the routine. The vast majority of the data will
occur in the range (80 < n < 250).

The value is manipulated thus... n = INT( n / 1.8 + .5 ) and the result
stored as one byte.

INT(expression) returns the largest integer less than or equal to the expression.

When the value is retrieved for summation, mean, standard deviation, and
so on, it is pre-processed as n = INT( n * 1.8 + .5 ).

My experience with this scheme is that the value returned is always
within +/- 1 of the original value, with the variation seemingly evenly
distributed.
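
For what it's worth, that bound can be confirmed by brute force over the
whole accepted range. A sketch in generic BASIC (not the actual routine;
the names are illustrative):

100 REM exhaustive round-trip check over the accepted range
110 FOR N=41 TO 449
120 B=INT(N/1.8+.5)
130 REM B runs 23..249, so it fits in one byte
140 M=INT(B*1.8+.5)
150 IF ABS(M-N)>1 THEN PRINT "FAIL AT";N
160 NEXT N
170 PRINT "CHECK DONE"

The +/- 1 bound is no accident: the first rounding moves the value by at
most 0.9 (half of 1.8), the second by at most 0.5, and since the original
and reconstructed values are both integers the difference can only be 0 or +/- 1.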

Data is evaluated in subsets, ranging from ~100 to 600 values, typically.

Individual data values vary in accuracy as follows...

90% of the data are within 10% of the actual value. All of the results are
within 20% of the actual value.

My sense is that the data compression scheme is not materially affecting
the results, but I don't know how to address this rigorously, so I'd appreciate
any input - Or even better, a pointer to how to evaluate the scheme.
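
The only empirical check I can think of is to run the same statistics on
the raw values and on the round-tripped values and compare. A generic-BASIC
sketch of that idea (uniform random data in the 80..250 band stands in for
the real table, and RND is assumed to return a value in [0,1)):

100 REM compare stats on raw vs round-tripped values
110 K=600
120 S1=0
130 S2=0
140 T1=0
150 T2=0
160 FOR I=1 TO K
170 N=80+INT(171*RND)
180 M=INT(INT(N/1.8+.5)*1.8+.5)
190 S1=S1+N
200 S2=S2+N*N
210 T1=T1+M
220 T2=T2+M*M
230 NEXT I
240 PRINT "RAW MEAN";S1/K;"SD";SQR((S2-S1*S1/K)/(K-1))
250 PRINT "RT MEAN";T1/K;"SD";SQR((T2-T1*T1/K)/(K-1))

A few runs of that show how far the compression moves the mean and SD, but
it isn't the rigorous treatment I'm after.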

thanks, Jack
 
Judging from the number of views your topic has had, it appears everyone else is like me: interested, but without a clue how to resolve the problem.

I've taken way too many hours of statistics, and have tried to flush any memory of the class after it was over.

As far as the TI-74 stats module goes, I've never heard anyone comment on the inaccuracy of its results.

Sorry I couldn't be of more help.
 
> I've taken way too many hours of statistics, and have tried to flush any memory of the class after it was over.

Yeah, me too. All I recall is that it had something to do with flipping a coin, and it got very complicated when calculus was added to the mix.

Perhaps run through the routine on paper to make sure your algorithm is sound?
 
Thanks guys.

I'm not using the stats module. I wrote the stats routines in BASIC (runtime ~35 secs) as a proof
of concept, then rewrote the Sum & Sum of Squares routines in assembler (runtime < 2 secs), which
is fast enough for the intended purpose.
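
For reference, the mean and standard deviation fall straight out of those
two accumulators. A generic-BASIC sketch (the DATA values are made-up
placeholders, and -1 is a stop sentinel):

100 REM mean and SD from a running sum and sum of squares
110 K=0
120 S1=0
130 S2=0
140 READ N
150 IF N<0 THEN 200
160 K=K+1
170 S1=S1+N
180 S2=S2+N*N
190 GOTO 140
200 PRINT "MEAN";S1/K
210 PRINT "SD";SQR((S2-S1*S1/K)/(K-1))
220 DATA 120,95,210,180,140,-1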

I'm pretty sure the results are good enough for said purpose - In the interim I've dug out one of my 40-year-old
college textbooks, and am doing some homework. Interesting stuff.

Thanks for responding.

Jack
 
Brother,

If you find statistics interesting, there's something wrong with you!!! :biggrin:

From what I remember, there are lies, damned lies, and statistics!
 
> there's something wrong with you!!!

Actually got interested again digging into DSP theory, but the data being mined is
blood glucose measurements; a personal interest of mine, you might say :cool:

I'm going to port the pgm to the CC40 - Got a 128K cartridge in the works...


Jack
 