
I'm not sure I follow.

Do you mean it's hard to find a single record out of many or that new insights can be discovered in aggregate?

The former is simply not true, databases exist for this purpose. The latter seems obvious.



I mean, gathering all kinds of junk may make you lose sight of what you're really about. See, for instance, various self-criticism from within the three-letter agencies about how operational capabilities were lost in favour of hoovering up as much data as possible, most of which is useless. When compiling and processing data is very painful, you make sure you compile and process the best data you can find.

I'm not saying we should go back to clay tablets or anything. I'm just saying that in any rapid technology transition, we risk losing the good with the bad.


I don't think I agree with this line of thinking.

How can we determine data is "useless" if we do not analyze the data?

New technology may have good and bad aspects but that doesn't mean it prevents us from doing what works.


Analyzing the data becomes more difficult as you put quantity over quality with respect to how you're structuring the data and where it's coming from.

There's a database somewhere of suspicious people on Do Not Fly lists, but how was it compiled? Should we put any weight on these lists of names?

I think this is what OP was alluding to: when it is more painful to collect and organize data, you make sure the data is worth collecting and organizing.

See also "Worse is Better"


I don’t agree that we have to make a trade-off between quality and quantity of data. That seems like a false choice to me.



