[600MRG] WSPRnet database

Ralph Wallio, W0RPK W0RPK at netins.net
Thu Oct 30 10:44:57 CDT 2014


Chris ---

I am not surprised that you made quick progress with this problem and am 
very happy that you continue to monitor 630m activities!

(For folks on this net that are not familiar with Chris, he was a major 
participant with our Midwest 630m ground-wave testing a few years ago 
and is our webmaster and host for http://630m.net/dev/, which is 
currently in limbo pending FCC approval of a new 630m ham band.)

I downloaded your September WSPR database segments, renamed the first 
file 09_14_csv00.csv, and loaded the result into Excel 2010.

This process yielded 200,000 rows, as you intended.  I then sorted 
these rows by the frequency values in column-F to quickly find all 
630m data points and deleted the data points for all other bands.
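
(A quick sanity check from the Linux side, for anyone curious: running

   wc -l 09_14_csv00.csv

should report 200,000 lines if the split came out as expected.)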

This process yielded 9,672 630m data points.  If the band proportions 
in the remaining segments are approximately the same, this process 
would yield 29 x 9,672 = ~280,000 worldwide 630m WSPR data points for 
the month of Sep14.

I then took one more step for North American-centric interests by 
sorting the 9,672 data points by transmitting grid square in column-D 
and deleting all data points not originating in North America.  This 
process yielded 4,938 630m WSPR data points originating in North 
America.
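
(For the command-line inclined, a rough equivalent of that North 
America cut might look like the sketch below.  It assumes the grid 
square described above really is the 4th comma-separated field, the 
file names are just placeholders, and the letter ranges are only a 
crude approximation of North American grid fields:

   awk -F, '$4 ~ /^[B-F][K-R]/' 630m_sep14.csv > 630m_sep14_na.csv

Edge cases such as far-eastern Canada would need the pattern adjusted, 
so treat it as a starting point rather than a finished filter.)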

There is more converting and sorting that will be of interest, e.g., 
the Timestamp field:

The time of the spot [column-B, rw] is in Unix time() format (seconds 
since 1970-01-01 00:00 UTC).  To convert to an Excel date value, use 
=time_cell/86400+"1/1/70" and then format it as a date/time.
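
(On the Linux side, a single Timestamp value can be spot-checked with 
GNU date; for example,

   date -u -d @1409529600

prints the corresponding UTC date and time, which is a handy 
cross-check on the Excel conversion above.  That particular value is 
just an illustration, not one pulled from the data.)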

NOW A QUESTION: Could other Linux command lines be easily used to 
similarly remove the WSPR data points for all other bands, building 
.csv files of approximately 280k 630m data points for each month?  We 
Excel users could take it from there.
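
(A guess at what that might look like, untested, on the assumption 
that the frequency in MHz really is the 6th comma-separated field, 
matching column-F above, and that all 630m spots fall between 0.4 and 
0.6 MHz:

   awk -F, '$6 > 0.4 && $6 < 0.6' xxxx.csv > 630m_xxxx.csv

where xxxx.csv is the full monthly file, as in Chris's split example.  
If that is close, the same one-liner run once per month would give the 
~280k-point 630m files directly.)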

TNX es 73 de Ralph W0RPK
http://showcase.netins.net/web/wallio/

On 10/30/2014 10:17 AM, Chris - KC0TKS wrote:
> Larry and others,
>
> I am a Linux user and, as is usually the case, there is a simple 
> command-line solution built into the OS for this problem.
>
> As a test, I split September's WSPR spots into files of 200,000 lines 
> each (approximately 1 day) and have them available for download at 
> https://dl.dropboxusercontent.com/u/6231083/wspr%20csv.zip
>
> This zip file is almost 100 MB, so make sure you have high-speed 
> Internet (or a lot of time to kill!).
>
> For those with Linux that want to do this themselves, the command I 
> used was "split -dl 200000 xxxx.csv 09_14_csv", where xxxx.csv is the 
> name of the large .csv file you want to split. It took about 10 
> seconds to complete on my old computer.
>
> For those without Linux, send me an email and I will split whatever 
> month you are interested in. I will split and post October when the 
> month is over.
>
> Please let me know if the split files unzip properly and open in your 
> spreadsheet. Sorry, I was in a rush this morning and didn't take the 
> time to figure out how to add the .csv extension to the files when 
> they were generated, so you will have to either add it manually or 
> right-click on the file you want to open, choose "open with", and 
> pick Excel or whatever you want to open them with.
>
> 73,
> Chris - KC0TKS
