

Posted: Thu Feb 28, 2008 2:36 pm
by pepsijoe
Actually I need to name and track 21-27 trillion different bits of data.
Each bit would have a value that would decrease every second until it reached zero. I need to label each individual bit of data so that once it reaches 0 it can be deleted and replaced. ANY IDEAS?

Posted: Thu Feb 28, 2008 5:02 pm
by phycowelder
21-27 trillion bits of data? wow!

my only suggestion would be maybe
arrays and pointers!


Posted: Thu Feb 28, 2008 6:02 pm
by MystikShadows
This isn't a very clear problem description. What would such a thing be used for? Even if we can't manage that quantity of information directly, knowing the purpose of what you're describing, we might be able to suggest a more efficient way of dealing with it.

So let us know what it's for exactly, and let's take it from there.


Posted: Thu Feb 28, 2008 7:57 pm
by burger2227
That sounds just plain ridiculous! How would you know if a program missed a few thousand? You wouldn't!

Don't delete each bit of data! Just delete the idea from your mind!

Posted: Thu Feb 28, 2008 9:52 pm
by Mentat
Use allocate? But there's a limit on memory, not to mention the computer's own limitations. Is this for the LHC? :D

It sounds ridiculous, but so was 103% of math and science at any point in time. So, load up a chunk (a couple gross or so megabytes at a time), process it (whatever you're doing), get rid of it and load up a new batch. I take it that there is an external source for this data, or is it being created by the computer? Whatever it is, this is no small task. I would guess you'll need a really big array or list.

Good luck with whatever this is. Pity, I haven't even a dozen gigabytes left free on my own machine.


Posted: Fri Feb 29, 2008 2:25 am
by burger2227
What are you guys talking about? Where would the user interaction come from? A select 5000 bytes shown on the screen?

I cannot BELIEVE that both of you fell for this obvious JOKE. LOL


PS: Better buy a mainframe computer and a TRILLION bits of memory!

Posted: Fri Feb 29, 2008 3:18 am
by phycowelder
Maybe read/write data files might be an option! Use the HD instead of memory!

Posted: Fri Feb 29, 2008 7:29 am
by Mentat
Or group the same states into one state. It's no good to waste cycles and bits on a couple million states that all have, say, 100 seconds left. You can have the computer group them (through linked lists), though you probably won't have a lot of working room for the computer to keep track. Maybe only group per set. That means the sets themselves will change over time. And if the computer has enough room, it may venture into neighboring sets for further simplification.
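The grouping idea above can be sketched as a map from remaining lifetime to a count of the cells sharing it (Python is used here purely for illustration, since the thread never settles on a language, and the counts are made up):

```python
from collections import defaultdict

# Map: seconds remaining -> number of cells in that state.
# Millions of identical countdowns cost a single dictionary entry.
groups = defaultdict(int)
groups[100] += 3_000_000   # 3 million cells with 100 s left
groups[100] += 2_000_000   # merging identical states is just addition
groups[50] += 1_000_000

def tick(groups):
    """Advance one second: every group's countdown drops by 1;
    groups that reach 0 are dropped (those cells 'die')."""
    new_groups = defaultdict(int)
    for remaining, count in groups.items():
        if remaining - 1 > 0:
            new_groups[remaining - 1] += count
    return new_groups

groups = tick(groups)
```

One tick touches only as many entries as there are distinct lifetimes, never as many as there are cells, which is the whole point of the grouping.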

Sounds like a job for Assembly.

Posted: Fri Feb 29, 2008 2:25 pm
by pepsijoe
As an update... the interaction is limited at first. Each bit of data is an RBC (red blood cell) in the human body, with an average of 21-27 trillion cells. Each bit of data has a "life span" of no more than roughly 10,368,000 seconds (about 120 days). So as each cell reaches the end of its "life span" it dies (gets deleted) and is replaced by a new one. Very simple--complex--concept, and it is no joke.

The problem is that I can't just load a set amount, compute, and start again, because it is a nearly never-ending process; it needs to regenerate at the same time that it loses cells. Now that you know, any help...?
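A quick back-of-envelope check of the numbers in this post (assuming a round 25 trillion cells, in the middle of the quoted 21-27 trillion range):

```python
cells = 25_000_000_000_000          # roughly mid-range of 21-27 trillion RBCs
lifespan_s = 10_368_000             # the quoted life span in seconds

# The quoted figure is exactly 120 days: 120 * 86,400 s/day
assert 120 * 86_400 == lifespan_s

# In steady state, deaths per second = population / lifespan,
# and births must match deaths for the population to hold steady.
turnover = cells / lifespan_s       # ~2.4 million cells per second
```

So the simulation never needs to touch trillions of cells per tick; only a couple million die (and are born) each second, which is what makes the batching ideas in this thread feasible.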


Posted: Fri Feb 29, 2008 3:32 pm
by burger2227
Life expectancy? How can you ASSUME that every cell will live for a certain amount of time? What happens if a disease is added to the mix?
I could go on, but even in the FB forum there have to be some limitations on rhetoric! Simple answers:

Men in USA = 72

Women in USA = 80

Now find something else to do that makes more sense. Even FB cannot do that kinda stuff. Try C or D or E. Ask God if you know him well enough!


PS: Lay off the caffeine, Pepsi!

Posted: Fri Feb 29, 2008 3:59 pm
by Mentat
Burger, it's a simulation. Duh. 8)
And as for the cells themselves, they can be grouped by age. There are millions (billions?) of red blood cells that are the same age and will die at (roughly) the same time, so you process that group. Then when the group dies, there'll be a new batch of same-age red blood cells to replace 'em. And so with neurons (same age, somewhat same lifetime). That'll do some tremendous simplification, not to mention introducing the hope of staying on the memory budget. It's a shot. With a little math, scientific insight, and some computer hacks, you can appear to 'simulate' more than what is actually going on in the computer.
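The age-cohort idea can be sketched with a queue of daily cohorts (again a Python illustration; the 120-day lifespan follows from the 10,368,000-second figure earlier in the thread, and the cohort size is hypothetical):

```python
from collections import deque

LIFESPAN_DAYS = 120
COHORT = 200_000_000_000   # hypothetical cells born per day (~24 trillion total)

# Left end = oldest cohort (dies next), right end = today's newborns.
cohorts = deque([COHORT] * LIFESPAN_DAYS)

def one_day(cohorts):
    """The oldest cohort dies; a same-size newborn cohort replaces it,
    so the population regenerates at the same rate it loses cells."""
    died = cohorts.popleft()
    cohorts.append(COHORT)
    return died

died = one_day(cohorts)
```

Instead of trillions of per-cell records, the whole population fits in 120 integers, one per cohort.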


Posted: Fri Feb 29, 2008 5:17 pm
by burger2227
So we can significantly lower trillions to thousands then and make a real program.

Glad you finally caught on to what I was implying LOL


Posted: Fri Feb 29, 2008 5:32 pm
by Codemss
Burger was 100% right. Trillions of bits is a ridiculous idea. There is another way I can think of to still try the things you want, pepsijoe.
Say each cell has a lifespan of 100. Then you can make an array of 100 long integers which hold how many cells have that lifespan left. Then you put them in a loop and shift everything one place (don't know if this is what you want).
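The shift-array idea above might look like this (a Python sketch of the described 100-slot counter array; the newborn count is made up):

```python
SPAN = 100
# counts[i] = number of cells with i+1 time units of life left
counts = [0] * SPAN
counts[SPAN - 1] = 1_000_000     # a million newborn cells with full lifespan

def tick(counts):
    """Shift everything one place: slot 0 falls off (those cells die),
    and an empty newborn slot opens at the top."""
    died = counts[0]
    shifted = counts[1:] + [0]
    return shifted, died

counts, died = tick(counts)
```

Each tick is one shift of 100 integers, regardless of how many cells the counters represent; newborns are added by incrementing the last slot.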