Hi
I have a raster grid and a polygon layer, and I want to calculate the mean raster value for each polygon.
My approach is simple: I loop through every raster cell and check whether the polygon intersects that cell. If it does, I add the cell value to a running sum and finally compute the mean as sum / number of cells inside the polygon.
Something seems to be wrong, though, because the results I get this way differ considerably from the results of the ArcGIS 'Zonal Statistics' tool. Also, clipping the raster by the polygon with DS gives me a new raster whose statistics are very close to the ArcGIS values.
So is there a problem with polygon.Intersects, or is my implementation wrong?
...
// center of the current raster cell
var cell = new Point(raster.Xllcenter + (raster.CellWidth * col),
                     raster.Yllcenter + (raster.CellHeight * row));

if (polygon.Intersects(cell))
{
    count++;
    sum += raster.Value[row, col];
}
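
For reference, here is roughly what the whole loop looks like (a simplified sketch using the same raster and polygon objects as above; NumRows, NumColumns and the NoDataValue check are my assumptions about the grid properties, not my exact code):

// Sketch: per-polygon mean by testing each cell center against the polygon.
// Assumes raster.NumRows / raster.NumColumns give the grid size and
// raster.NoDataValue marks cells without data.
double sum = 0;
int count = 0;

for (int row = 0; row < raster.NumRows; row++)
{
    for (int col = 0; col < raster.NumColumns; col++)
    {
        // cell center, computed the same way as above
        var cell = new Point(raster.Xllcenter + (raster.CellWidth * col),
                             raster.Yllcenter + (raster.CellHeight * row));

        if (polygon.Intersects(cell))
        {
            double value = raster.Value[row, col];
            if (value == raster.NoDataValue) continue; // skip nodata cells

            sum += value;
            count++;
        }
    }
}

double mean = count > 0 ? sum / count : double.NaN;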