Hi,

I'm studying TinyOWS as a WFS server, using QGIS as the WFS client.

I noticed a practical issue when the dataset is huge: the first visualization takes a very long time to start, because the QGIS WFS plugin has to download the whole dataset.
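As far as I can tell, on the first load the plugin issues a plain GetFeature request for the whole layer, with no BBOX restriction; something like the following (server URL and layer name are made up for the example):

    http://myserver/cgi-bin/tinyows?service=WFS&version=1.1.0&request=GetFeature&typename=tows:mylayer

so on a huge layer the full dataset goes over the wire before anything is drawn.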
The only solution I found to avoid this time loss is to activate the "limit-feature" option. But it is a strange solution: the QGIS WFS plugin downloads features up to the configured limit, and after that it never shows any other part of the dataset.

If I try to zoom to a detail that lies outside the first downloaded features, QGIS simply shows nothing. Perhaps this is an issue of the QGIS plugin, but in any case I think the "limit-feature" option is not a solution in many use cases.
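For clarity, the option I mean is the feature limit in the TinyOWS config.xml; if I remember the syntax correctly, it is something like:

    <limits features="1000" />

which caps the number of features returned, independently of the area requested.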
I think a better approach would be a dimensional one: allow configuring the maximum retrievable extent as a size in units of length.

Something like:

    <limits unit-of-length="1000" />

(don't send data when the requested portion is larger than a square of 1000 x 1000 units of length).

For example, with the maximum retrievable size set to 1000 meters, the WFS server would send data only if the requested bbox is smaller than 1000 m x 1000 m, without looking at how many features it may contain.
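To make the idea concrete, here is a minimal sketch of the check the server could run before answering a GetFeature request. It is written in C (as TinyOWS itself is), but the type and function names are invented for illustration, not actual TinyOWS internals:

    #include <stdbool.h>

    /* Hypothetical bbox type for the example, not the TinyOWS ows_bbox. */
    typedef struct {
        double xmin, ymin, xmax, ymax;
    } req_bbox;

    /* Answer the request only if the asked extent fits inside a square
     * of max_extent x max_extent, measured in the layer's length units. */
    static bool bbox_within_limit(const req_bbox *req, double max_extent)
    {
        double width  = req->xmax - req->xmin;
        double height = req->ymax - req->ymin;
        return width <= max_extent && height <= max_extent;
    }

The nice thing is that the cost of this check does not depend on the data at all: no counting of features, just a comparison on the requested extent.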
Of course this means having something like a scale-dependent limitation, but in many use cases that is more acceptable than a limit on the number of features.

Regards,

-- 
-----------------
Andrea Peri
. . . . . . . . .
qwerty àèìòù
-----------------