DB Limit?

Suggestions for WiGLE/JiGLE/DiGLE

15 posts • Page 1 of 1

Postby TROMOS » Sun Jul 02, 2017 7:38 am

Every time I try to start WiGLE, I get a "Fatal DB Problem" message. I have sent the error report via e-mail. I notice that the size of my SQLite database is now 2 gigs. Is this a hard limit? If so, how do I carry on?

Postby arkasha » Sun Jul 02, 2017 3:58 pm

There's no hard limit on our side, but I'll do some research into the SQLite DB limits in various implementations.
Is there space left on the device to keep growing?

Postby TROMOS » Sun Jul 02, 2017 6:38 pm

Storage settings report 6.85 GB available (out of 14.46 GB), so that is hopefully more than enough.
Thanks for looking into the problem.

Postby arkasha » Mon Jul 03, 2017 12:04 am

Could be DB corruption as well - it happens occasionally. Still researching here, will post suggestions as they come up:

Options if you're impatient:
1. You haven't lost any data you've uploaded - all of that is in the gzipped CSV files sent to WiGLE, and those should still be cached locally (this might be a good time to back them all up).
2. The "only mine" search in WiGLE will vend all your trilaterated data back to you any time you want (inconvenient, but gets you your points back).
3. Currently, the client can get a previously-seen list back on re-install with valid credentials - it doesn't get your observations back, but you'll have a valid DB pointer. We're working on improving this in future releases.
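For option 1, the backup can be scripted. A minimal Python sketch, assuming the gzipped CSV upload files sit in a folder you can reach (the actual location varies by device and app version, so `backup_uploads` and its paths are illustrative, not part of any WiGLE tooling):

```python
import shutil
from pathlib import Path

def backup_uploads(source: str, backup: str) -> list[str]:
    """Copy every gzipped CSV upload file from source to backup.

    Returns the sorted list of filenames copied.
    """
    src, dst = Path(source), Path(backup)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(src.glob("*.csv.gz")):
        shutil.copy2(f, dst / f.name)  # copy2 preserves timestamps
        copied.append(f.name)
    return copied
```

Point it at the app's upload-cache directory and any safe destination (external SD, PC share, etc.).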

Cheers, and watch this space for updates.

Postby TROMOS » Mon Jul 03, 2017 12:30 am

Thanks for your efforts. I would already have tried clearing everything and re-installing, but I had some problems when I went through the process of importing my observations almost a couple of years ago. I had over 2 million observed and was hitting an import limit. Eventually bobzilla bumped the limit to 3 million and I was able to complete the download. I now have well over 4 million, so unless the limit has been further bumped or removed, it will fail again.
Refer to topic 'Import observed appears to have limit' in this forum (September 2015).

Postby bobzilla » Tue Jul 04, 2017 3:02 am

It's still on our to-do list to make that import chunked, so that remains an issue. The error you are getting is "SQLiteDiskIOException: disk I/O error (code 1546)", which translates to: "(1546) SQLITE_IOERR_TRUNCATE: The SQLITE_IOERR_TRUNCATE error code is an extended error code for SQLITE_IOERR indicating an I/O error in the VFS layer while trying to truncate a file to a smaller size."

Since it's trying to reduce the file size, I would guess that's an error in the hardware itself. Android storage is formatted FAT32, which has a 4 GB file-size limit, so that should be the hard cap. That said, if there's some code using a signed 32-bit integer, it would only be able to address 2 GB. What is the exact byte size of the WiGLE SQLite file? 2^31 = 2,147,483,648.
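That signed-32-bit suspicion is easy to test mechanically. A quick Python sketch (the path and the `diagnose_size` helper are illustrative, not part of any WiGLE tooling; point it at your exported copy of the database):

```python
import os

# 2,147,483,648 bytes: the ceiling if any code path stores a file
# size or offset in a signed 32-bit integer
SIGNED_32_BIT_MAX = 2**31

def diagnose_size(path: str) -> str:
    """Compare a file's size against the signed 32-bit boundary."""
    size = os.path.getsize(path)
    if size >= SIGNED_32_BIT_MAX:
        over = size - SIGNED_32_BIT_MAX
        return f"{size} bytes: past the signed 32-bit boundary by {over} bytes"
    return f"{size} bytes: {SIGNED_32_BIT_MAX - size} bytes of headroom left"
```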

Postby arkasha » Tue Jul 04, 2017 5:44 pm

Some VFAT implementations apparently use an unsigned int, giving them a 4 GB limit, but I can't find conclusive documentation on this.

We'll prioritize chunked handling of observation import; I've already done half of it here (streaming JSON parsing): https://github.com/wiglenet/wigle-wifi- ... g/pull/138, slated for the next WiWiWa release. We just need a streaming download handler.
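The PR itself is Java, but the idea behind streaming import can be sketched in a few lines of Python: pull observations out of a JSON array one object at a time as network chunks arrive, instead of buffering the whole response in memory. (An illustrative sketch of the technique, not the app's actual parser.)

```python
import json

def stream_objects(chunks):
    """Yield JSON objects one at a time from an iterable of text chunks
    that together form a JSON array, without holding the full payload
    in memory at once."""
    decoder = json.JSONDecoder()
    buf = ""
    for chunk in chunks:
        buf += chunk
        # drop the array opener / separators before the next object
        buf = buf.lstrip().lstrip("[,").lstrip()
        while buf:
            try:
                obj, end = decoder.raw_decode(buf)
            except ValueError:
                break  # incomplete object; wait for more data
            yield obj
            buf = buf[end:].lstrip().lstrip(",]").lstrip()
```

Memory use stays proportional to one observation rather than the whole download, which is what makes multi-million-row imports feasible on a phone.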

It's a hack, but since we don't restore every observation, the deleted-and-re-imported DB will actually be much smaller than the original.

Postby TROMOS » Wed Jul 05, 2017 8:33 pm

Sounds good to me. I have deleted everything and re-installed. I'm getting by for now with a new database, the only drawback being that I can't tell whether I've hit rich pickings or am just collecting lots of repeats. I can live with that for the time being, as I will probably replace the device soon (it's very poor at hanging onto a GPS signal). By the time I get my new toy, the chunked download will hopefully be available.
Many thanks.

Postby TROMOS » Wed Jul 05, 2017 8:58 pm

Hi Bobzilla,
I had copied all the files to an external HD before wiping. Win 7 reports the size of the SQLite file as 2,148,102,144 bytes. Not exactly 2**31, but maybe near enough to raise suspicions?
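Near enough indeed: the reported size sits just past the 2^31 boundary. The arithmetic (a quick check on the numbers in this thread):

```python
reported = 2_148_102_144  # SQLite file size reported by Win 7
boundary = 2**31          # 2,147,483,648: signed 32-bit byte ceiling

overshoot = reported - boundary
print(f"file exceeds the 2**31 boundary by {overshoot} bytes "
      f"(~{overshoot / 1024:.0f} KB)")
```

An overshoot of roughly 604 KB is consistent with the database failing right as it crossed the signed 32-bit limit.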

Postby arkasha » Sat Jul 08, 2017 6:03 pm

Well, that bumps a streaming import function up the priority list - thanks for letting us know!

Out of curiosity, if you access the SQLite file on a file system that supports large files, is the database intact?
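SQLite has a built-in answer to that question. A minimal sketch using Python's standard `sqlite3` module (the path to your backed-up copy is whatever you named it):

```python
import sqlite3

def check_db(path: str) -> str:
    """Run SQLite's built-in integrity check.

    Returns the string 'ok' if the database passes, otherwise a
    description of the corruption found.
    """
    con = sqlite3.connect(path)
    try:
        return con.execute("PRAGMA integrity_check").fetchone()[0]
    finally:
        con.close()
```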

Postby arkasha » Sun Jul 09, 2017 7:03 am

I've committed a first draft of fully stream-parsed observation download as a candidate feature for 2.20. It's efficient enough that I can load my own observations on even my most limited testing devices.

As always, feedback is greatly appreciated!

https://github.com/wiglenet/wigle-wifi- ... g/pull/151

Postby TROMOS » Sat Aug 05, 2017 6:44 pm

Sorry for the delay in getting back to you, hectic month! 'DB Browser for SQLite' under Win 7 on my laptop seems to read the file OK. I haven't tried anything more taxing than a simple read and display.

Postby arkasha » Sat Aug 05, 2017 7:06 pm

Hm, OK, so that settles it: we're probably being denied writes once the DB plus a new commit reaches the file-system limit.
We'll need to find ways to partition this in the long run.

Bright side: if you use the new app version / streaming download to establish a new DB, it won't contain all the old *observations*, but it will contain a record of every *network* you've seen so far, and it should give you a fresh (much smaller) DB to continue stumbling with. You're at the edge of the size Android can support for a local observation DB simply because of the file-system type; we'll have to figure out how we're going to shard these going forward, I guess.

Until we come up with a design that makes everyone happy, I recommend keeping the backup SQLite file and starting a fresh one. Re-downloading your networks should result in a much smaller DB (and probably improved stumbling speed, too!). The streaming download broke support for Gingerbread devices (yes, they're still out there!) so we'll probably restore the old code path with a version check around it.

Cheers,

-Ark and the WiGLE team.

Postby TROMOS » Sat Aug 05, 2017 7:22 pm

Thanks. I have started a new DB from scratch. Despite not getting around much last month, it is already at 124 MB! About the only drawback is that I can't really tell how well I'm doing until after I upload, since the previous guesstimate (60-70% of 'new' networks translating into new first finds) no longer holds. On the plus side, as you pointed out, there's a snappier response, particularly on startup, where it is ready to go instantly rather than waiting a quarter of a minute for the DB to load. I might just get used to this and archive off the DB and start a new one every time it hits a gigabyte.
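That archive-and-restart routine could be scripted. A hedged sketch in Python, assuming the DB file is reachable as an ordinary file (`rotate_if_large` is a hypothetical helper, not part of the app):

```python
import shutil
import time
from pathlib import Path

GIGABYTE = 1024**3

def rotate_if_large(db_path: str, archive_dir: str, limit: int = GIGABYTE) -> bool:
    """Move the DB into an archive folder once it exceeds `limit` bytes,
    letting the app start a fresh one on next launch.

    Returns True if the file was rotated, False otherwise.
    """
    db = Path(db_path)
    if not db.exists() or db.stat().st_size < limit:
        return False
    dest = Path(archive_dir)
    dest.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    shutil.move(str(db), dest / f"{db.stem}-{stamp}{db.suffix}")
    return True
```

Run it only while the app is stopped, so you never move the file out from under an open connection.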

Postby arkasha » Sat Aug 05, 2017 7:46 pm

The "download observations" function should work fine for you now (we rewrote it in 2.20) if you want to know your new/seen stats beforehand - but only if you're willing to pay the performance penalty vs. just keeping a clean DB!

Cheers,

-ark
