python - Performance of finding all points within a certain distance by lat/long -


I have a CSV file of points tagged with lat/long coordinates (~10k points). I'd like to search for all points within a given distance of a user-specified lat/long coordinate, say, for example, the centroid of Manhattan.

I'm pretty new to programming and databases, so this may be a basic question. If so, I apologize. Is a performant search possible in pure Python without using a database? That is, could I just read the CSV into memory and search it with a Python script? And if that is performant, will it scale as the number of points increases?

Or is this infeasible in Python, and do I need to investigate using a database that supports geospatial queries?

Additionally, how can I go about understanding the performance of these types of calculations, so that I can develop an intuition for this?

This is possible in Python without a database. I recommend using NumPy. Do the following:

  1. Read the points from the CSV into a NumPy array.
  2. Calculate the distance of each point to the given point.
  3. Sort by distance, or find the single minimum distance using argmin.

Because the calculations are vectorized, they happen at close to C speed.

On an okay computer, the I/O should take 2-3 seconds and the calculation less than 100-200 milliseconds.

In terms of the math, you can try the haversine formula: http://en.wikipedia.org/wiki/haversine_formula
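The three steps above can be sketched roughly as follows. The sample coordinates, the 10 km radius, and the `haversine_km` helper are illustrative assumptions, not part of the original question; a real script would read an actual file instead of the inline CSV text.

```python
import io

import numpy as np

# Stand-in for the real CSV file (columns assumed to be lat,lon).
csv_data = io.StringIO(
    "lat,lon\n"
    "40.7580,-73.9855\n"   # Times Square
    "40.6892,-74.0445\n"   # Statue of Liberty
    "34.0522,-118.2437\n"  # Los Angeles
)

# 1. Read the points from the CSV into a NumPy array (skip the header row).
points = np.loadtxt(csv_data, delimiter=",", skiprows=1)


def haversine_km(lat, lon, lats, lons):
    """Vectorized haversine distance in km from one point to many points."""
    earth_radius_km = 6371.0
    lat, lon, lats, lons = map(np.radians, (lat, lon, lats, lons))
    dlat = lats - lat
    dlon = lons - lon
    a = np.sin(dlat / 2) ** 2 + np.cos(lat) * np.cos(lats) * np.sin(dlon / 2) ** 2
    return 2 * earth_radius_km * np.arcsin(np.sqrt(a))


# 2. Distance of each point to the given point (roughly the centroid of Manhattan).
query_lat, query_lon = 40.7831, -73.9712
dists = haversine_km(query_lat, query_lon, points[:, 0], points[:, 1])

# 3. Filter by radius, or find the single closest point with argmin.
within_10km = points[dists <= 10.0]
nearest = points[np.argmin(dists)]
```

Every distance is computed in one pass over NumPy arrays rather than in a Python-level loop, which is where the near-C speed comes from.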

