
Well, to be pedantic, it will grow, at worst, to 24 * 60 * 60 * 1000000000 entries (about 8.64e13), and only if you are calling increment() every single nanosecond. Though you are right that it ignores realistic memory constraints; I was approaching it more as a thought exercise to address the moving window.
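
For concreteness, here is roughly what I had in mind (a Python sketch; the increment()/count() names are my reading of the article, and monotonic_ns is just one way to get nanosecond timestamps):

    import time
    from collections import deque

    WINDOW_NS = 24 * 60 * 60 * 1_000_000_000  # one day in nanoseconds

    class EventCounter:
        def __init__(self):
            self.timestamps = deque()  # one entry per increment() call

        def increment(self):
            self.timestamps.append(time.monotonic_ns())

        def count(self):
            # Drop entries that have fallen out of the 24-hour window,
            # then report exactly how many events remain inside it.
            cutoff = time.monotonic_ns() - WINDOW_NS
            while self.timestamps and self.timestamps[0] < cutoff:
                self.timestamps.popleft()
            return len(self.timestamps)

The deque gives an exact answer for the moving window, at the cost of one entry per call.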


No, that's just the maximum possible in a single day.


Your bigger problem is with this "trim the array" idea, which is definitely not an obvious solution. The way you've coded it, you'd have to tally almost an entire day's worth of deltas just to determine the trim point. And you may still exceed that single-day maximum memory, because you'd accumulate overruns between trims. I'll leave you to think about that one.

(Hint: google round robin database. You know, the solution that I mentioned.)


Interestingly, based on the explanation at [1], round robin databases don't actually solve the question precisely as described in the article: they sacrifice precision by aggregating over larger time increments (sketched below), and the author was looking for an exact algorithm.

1: http://jawnsy.wordpress.com/2010/01/08/round-robin-databases...
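
To make the trade-off concrete, here is a rough sketch of a round-robin counter with one-second buckets (my own illustration, not code from [1]); memory stays fixed, but the count is only accurate to the bucket width:

    import time

    BUCKETS = 24 * 60 * 60  # one bucket per second of the day

    class RoundRobinCounter:
        def __init__(self):
            self.counts = [0] * BUCKETS  # fixed memory, reused in a ring
            self.current_sec = int(time.monotonic())

        def _advance(self):
            now = int(time.monotonic())
            # Zero every bucket passed since the last call, so stale
            # counts from ~24 hours ago are not re-counted.
            steps = min(now - self.current_sec, BUCKETS)
            for i in range(1, steps + 1):
                self.counts[(self.current_sec + i) % BUCKETS] = 0
            self.current_sec = now
            return now % BUCKETS

        def increment(self):
            self.counts[self._advance()] += 1

        def count(self):
            self._advance()
            return sum(self.counts)  # exact only to one-second granularity

Widening the buckets, as RRDs do for older data, shrinks memory further but loses even more precision.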


My comments, as mentioned, were ignoring real memory constraints. I don't know why you feel the need to come off as frustrated.


It seemed obvious to me that you would trim the array whenever it grew beyond a day's worth of entries, since the most the API ever asks for is the count over a single day.

However, now that I think about it, increment() could be called multiple times during a single nanosecond, in which case the worst case gets multiplied by whatever the maximum number of calls per nanosecond is. My once-per-nanosecond comment earlier was an erroneous assumption.
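
If that matters, one way to restore the per-nanosecond bound is to coalesce calls that land on the same nanosecond into (timestamp, count) pairs; a sketch of the same idea:

    import time
    from collections import deque

    WINDOW_NS = 24 * 60 * 60 * 1_000_000_000  # one day in nanoseconds

    class CoalescingCounter:
        def __init__(self):
            self.events = deque()  # (nanosecond timestamp, call count) pairs

        def increment(self):
            now = time.monotonic_ns()
            if self.events and self.events[-1][0] == now:
                # Same nanosecond as the previous call: bump its count
                # instead of appending, capping entries at one per ns.
                ts, n = self.events[-1]
                self.events[-1] = (ts, n + 1)
            else:
                self.events.append((now, 1))

        def count(self):
            cutoff = time.monotonic_ns() - WINDOW_NS
            while self.events and self.events[0][0] < cutoff:
                self.events.popleft()
            return sum(n for _, n in self.events)

That caps the array at the single-day maximum again, regardless of call rate.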



