Agree. A short data sample would be more convenient for quick testing.
http://www.filedropper.com/closestdestinationsample
It's 97 MB because of the latitude and longitude table that pairs with the postcodes.
I didn't start with the long/lat coordinates; I had to get them from another source and then tie them together based on postcode.
If that's still an issue, I can take a static copy of the tables with all the values and pass it over that way, which should make it a lot smaller without the long/lat CSV included.
Thanks,
Josh
Give me 20 minutes and I'll have the smaller file size.
Thanks,
Josh
Hi @AlB @dedelman_clng,
This link should work now; I think it's about 60 KB now. Use the v2 file.
Thanks,
Josh
or here;
Thinking ahead, I may need to filter these results at a later point as well, based on a DATE field, so ALL() might not be the best approach and/or using a measure might be better.
So with the date column we could ask, on 22/01, whether anyone is travelling with a neighbour greater than 20 miles away.
With ALL() in the formula, would that ignore the visual filter that only shows house references travelling today, as an example?
Or would the visual slicer just limit the results and base the calculation on the data that's relevant?
Thanks,
Josh
If the nearest point is to be dynamic, it will have to be a measure. You would still want to use ALL, but then would have to put filters back onto the data. I'll take that into consideration when I start looking at your sample file.
I'm having a look at the file and with my approach I do get that error that @Joshua_Peter was mentioning:
"An argument of function 'ACOS' has the wrong data type or the result is too large or too small."
That's a bit odd. I don't think the data type is the problem here; ACOS should throw that error if the argument is outside [-1, 1], which is the valid input range for the function. I checked the values that are passed to ACOS and all fall within that range. There are some 1s, so I thought that, due to rounding, those might actually be 1 + 1e-N with N = 10 or greater. I've checked up to the 15th decimal and they are all zeros. Plus, from a couple of tests I've run, ACOS ignores decimals beyond the 15th.
So I'm a bit puzzled as to why this error is generated. I think the approach should work otherwise.
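One way this error can arise even when every printed value looks in range: floating-point rounding can push the ACOS argument a single ulp above 1, which is invisible at 15 printed decimals but still outside the domain. A quick sketch in Python (math.acos behaves like DAX's ACOS for this purpose; the clamping helper is my own suggestion, not anything from the thread):

```python
import math

def safe_acos(x):
    # Clamp the argument into ACOS's valid domain [-1, 1] before calling it.
    # A one-ulp overshoot above 1 prints as "1.000000000000000" but still
    # triggers a domain error without the clamp.
    return math.acos(max(-1.0, min(1.0, x)))

# 1 + 1e-16 rounds back to exactly 1.0 in double precision, so this is fine:
math.acos(1 + 1e-16)

# ...but the next representable double above 1.0 is not:
overshoot = 1.0000000000000002  # one ulp above 1.0
try:
    math.acos(overshoot)
except ValueError:
    print("domain error")  # the analogue of the DAX ACOS error

# Clamping removes the failure without meaningfully changing the result:
print(safe_acos(overshoot))
```

In DAX the same clamp can be written inline, e.g. `ACOS ( MIN ( MAX ( arg; -1 ); 1 ) )`.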
Taking it back slightly, to emulate the working scenario with the cinema: if we had a duplicate Route Information table, or a distinct-values-only table, is there a way to filter the duplicate table to remove the same house coordinates, based on the MEMNO, before the formula is run?
Would that work? Then the answer can't be 0, as the address would be removed as an option per MEMNO.
Then we'd just execute the same formula as the cinema one?
Thanks,
Josh
I think my approach solves the issue of eliminating the zero distance to the base house itself. The problem lies with the distance calculation formula. I did a test: instead of using the haversine formula, which is raising the error, I used a simple approximation based on the Euclidean distance between two points, adjusted for the approximate size in km of a degree of lat/lon. A rough approximation, probably, but that is beside the point; I just wanted to see if the approach works. It seems to. Here's the calculated column:
Distance to Closest House (Miles)3 =
VAR HouseLatitude = 'Route Information'[Lattitude]
VAR HouseLongitude = 'Route Information'[Longitude]
VAR EarthCircumference = 3959
VAR P = DIVIDE ( PI (); 180 )
RETURN
    MINX (
        'Route Information';
        VAR CinemaLatitude = 'Route Information'[Lattitude]
        VAR CinemaLongitude = 'Route Information'[Longitude]
        VAR _DistanceFromCurrentRough =
            80 * SQRT (
                POWER ( HouseLatitude - CinemaLatitude; 2 )
                    + POWER ( HouseLongitude - CinemaLongitude; 2 )
            )
        RETURN
            IF ( _DistanceFromCurrentRough <> 0; _DistanceFromCurrentRough )
    )
This provides results that look meaningful.
On top of that, I tried a variation that does not exclude the zero distances obtained:
Distance to Closest House (Miles)3 =
VAR HouseLatitude = 'Route Information'[Lattitude]
VAR HouseLongitude = 'Route Information'[Longitude]
VAR EarthCircumference = 3959
VAR P = DIVIDE ( PI (); 180 )
RETURN
    MINX (
        'Route Information';
        VAR CinemaLatitude = 'Route Information'[Lattitude]
        VAR CinemaLongitude = 'Route Information'[Longitude]
        VAR _DistanceFromCurrentRough =
            80 * SQRT (
                POWER ( HouseLatitude - CinemaLatitude; 2 )
                    + POWER ( HouseLongitude - CinemaLongitude; 2 )
            )
        RETURN
            _DistanceFromCurrentRough
    )
and this, as expected, yields zero in all rows due to the comparison with the base house itself, which was the initial problem/question.
So that can be solved with this approach.
Now, the next issue concerns something in the haversine formula, in the data, or in some odd behavior that raises the ACOS error mentioned earlier. But that's a whole different story.
Agree?
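For reference, the ACOS-based formula discussed above (strictly speaking the spherical law of cosines rather than the haversine) can be made robust by clamping the ACOS argument. A Python sketch of that fixed calculation, using the same 3959-mile Earth radius as the DAX column (the coordinates in the usage line are illustrative, not from the sample file):

```python
import math

EARTH_RADIUS_MILES = 3959  # as in the DAX column

def law_of_cosines_miles(lat1, lon1, lat2, lon2):
    # Spherical law of cosines (the ACOS-based formula from the thread),
    # with the argument clamped into [-1, 1] so that rounding on identical
    # or near-identical points cannot raise a domain error.
    p = math.pi / 180
    a = (math.sin(lat1 * p) * math.sin(lat2 * p)
         + math.cos(lat1 * p) * math.cos(lat2 * p) * math.cos((lon2 - lon1) * p))
    return EARTH_RADIUS_MILES * math.acos(max(-1.0, min(1.0, a)))

# Identical points: without the clamp, `a` can land one ulp above 1 and
# ACOS fails; with the clamp the distance comes out as (essentially) zero.
print(law_of_cosines_miles(53.52, -1.13, 53.52, -1.13))
```

The same clamp translated into the DAX column would wrap the ACOS argument in `MIN ( MAX ( …; -1 ); 1 )` before multiplying by the Earth radius.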
Hi @AlB
Yeah, I have tried the calculated column on the full data set, and prior to any slicers being applied I have managed to verify that the distances produced are very close to accurate, and certainly accurate enough for what I intend.
There's a slight issue when we filter the data, though: there is a Date column which records when these points are relevant or occurred over the previous 7 days.
If I filter to only show houses on the 16th January, as an example, the closest distance remains the same, but its closest may be from a house on the 18th January, so in this scenario it wouldn't be a possible closest.
Based on the conversation, I did try creating a calculated measure with the same formula, without much success; it doesn't relate to the MEMNO, so it won't display the values for the closest distance in a table view using the MEMNO.
So it is working from a total-dataset view, but the information won't update as we filter down to a specific date, or to only houses with a closest cinema of Doncaster Vue, as an example.
How would we make the same working formula work with filtered data, as a measure?
Thanks though; from a dataset point of view it's great, and from the points I've validated it is fairly accurate.
Thanks,
Josh
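To make the filtered behaviour concrete: a measure has to take the minimum distance over only the rows left by the current slicers (e.g. the chosen date), while still excluding the base house's own MEMNO. Here is that logic sketched in Python; the field names `memno`/`lat`/`lon`/`date` are hypothetical stand-ins for MEMNO, Lattitude, Longitude and the Date column, and the sample rows are invented:

```python
import math
from datetime import date

EARTH_RADIUS_MILES = 3959

def distance_miles(lat1, lon1, lat2, lon2):
    # Spherical law of cosines with a clamped ACOS argument.
    p = math.pi / 180
    a = (math.sin(lat1 * p) * math.sin(lat2 * p)
         + math.cos(lat1 * p) * math.cos(lat2 * p) * math.cos((lon2 - lon1) * p))
    return EARTH_RADIUS_MILES * math.acos(max(-1.0, min(1.0, a)))

def closest_house_miles(houses, memno, on_date):
    # Keep only the rows the slicer would leave (the chosen date), then take
    # the minimum distance to any *other* house. This is what a measure that
    # iterates ALL('Route Information') and then restores the date filter
    # has to compute per MEMNO.
    base = next(h for h in houses if h["memno"] == memno)
    candidates = [h for h in houses
                  if h["date"] == on_date and h["memno"] != memno]
    if not candidates:
        return None  # no neighbour visible under the current filter
    return min(distance_miles(base["lat"], base["lon"], h["lat"], h["lon"])
               for h in candidates)

houses = [
    {"memno": 1, "lat": 53.52, "lon": -1.13, "date": date(2021, 1, 16)},
    {"memno": 2, "lat": 53.53, "lon": -1.14, "date": date(2021, 1, 16)},
    {"memno": 3, "lat": 53.10, "lon": -1.50, "date": date(2021, 1, 18)},
]

# On the 16th, house 1's nearest neighbour is house 2 (roughly a mile away);
# house 3 is excluded because its row belongs to the 18th.
print(closest_house_miles(houses, memno=1, on_date=date(2021, 1, 16)))
```

In DAX terms, the equivalent move is to iterate `ALL ( 'Route Information' )` inside the measure and then re-apply the date (and any other slicer) selection, e.g. via `VALUES`/`TREATAS` on the filtered columns, rather than relying on the calculated column, which is evaluated once at refresh time and never responds to slicers.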