The Loading and Maintenance of Cadastral Data in the Spatial Data Infrastructure (SDI)
1. The Loading and
Maintenance of Cadastral
Data in Manitoba’s Spatial
Data Infrastructure (SDI)
Bob Bruce & Tony Viveiros
FME World Tour 2014
April 11
2. Presentation Outline
• Background to the SDI Project
• Description of the Cadastral Datasets
• Implementation of Table and Feature
Class to house the data
• Detailed description of Workbench files
and their operation
• Next steps
3. What is an SDI?
The technologies,
policies and people
necessary to enable the
use of spatially
referenced data through
all levels of government,
stakeholders and the
public.
SDI Vision:
Manitoba’s source
for authoritative
geospatial data.
4. Manitoba SDI Objectives
1.To build (design and implement) an SDI for the
Province of Manitoba that:
– stores, manages and disseminates Manitoba’s
geospatial data across government and to the public
– is scalable to meet anticipated future needs
2.To provide an SDI Governance model that
clarifies the tasks and policies required for
SDI success.
9. The Cadastral Datasets
• Accurately computed cadastral datasets
in Winnipeg
• Accurately computed cadastral datasets
in Manitoba
• Manitoba Reference Grid
• Northern Quarter Sections
10. Cadastral Datasets in Winnipeg
• Maps of areas of Winnipeg on the MLI
• Area 6 of the Winnipeg Cadastral Mapping
11. Cadastral Datasets in Manitoba
• In the MLI we
currently start by
accessing data
from a map of
the province to
go to subareas
• By clicking on a
subarea we go
to a map that
allows us to
select and
download data
from a dataset
13. Northern Quarter Section Grid
A computed
theoretical quarter
section grid of the
unsurveyed north
14. Maintaining a Seamless Cadastral
Polygon Layer
• Each of the cadastral polygon fabrics have
been fitted together and are ready to be
combined into one polygon fabric
• These fabrics are combined into a single
GeoDatabase polygon feature class
• The individual datasets are tracked with a
database table
15. Cadastral Jobs Table
A record of which cadastral datasets are
currently in the cadastral polygon feature
class
16. Workbench Files
• One workbench file to ‘decide’ whether
an input dataset is a new one or a
replacement one and run the processes
to populate the feature class and job
table
• One workbench file (called by the first
one) to actually insert a dataset into the
polygon feature class
23. Further Improvements
• Develop processes to import the
Northern ‘theoretical’ quarter section
grids
• Develop a workbench file that uses the
Directory and File Pathnames reader to
scan folders for datasets and call the
workbench files
SDI Vision: Manitoba’s source for authoritative geospatial data.
SDI Operational Environment.
Provide ArcGIS Desktop in a CITRIX environment. This will ease deployment issues, offer 64-bit computing benefits, and improve performance for users with limited bandwidth. ArcGIS Desktop generates a high volume of network traffic when working with data over the network, which causes poor performance on slow connections. CITRIX improves on this problem because it is installed in the same data centre as the data, placing it as close to the data as the network allows. The CITRIX client connection is optimized to provide a quality screen display on the client computer. Printing is done on the client computer to a local printer/plotter.
A SQL Server Enterprise Geodatabase (ArcSDE technology) will be available for Operational GIS usage. Each department will have its own database and security will be in place to give users a working area. All Esri vector data can be moved into the Enterprise Geodatabase. Sandbox testing will determine limitations. Working in this environment offers features that most users have not previously had access to in an Enterprise environment – multiuser editing, user security model, long transactions, and versioning. The data in the database will be backed up regularly. The design is scalable so that more server capacity can be added to meet growing demand.
Network file storage will be allocated for the centralization of GIS data. The expectation is that remaining operational vector data on local drives or existing network storage that is not immediately going into the Enterprise Geodatabase will be moved to this new location.
The new network and database storage will be available to CITRIX on a fast network link and will also be available to existing ArcGIS Desktop users over their provided network connections.
Imagery Storage: At the bottom of the slide, a storage component for imagery and LiDAR will be added to provide storage for the SDI as well as desktop users. File-based raster/LiDAR data will be stored here. Where possible, mosaics will be created for use by the SDI imagery services as well as on the desktops. This component exists outside of a particular zone because it will be used by all.
Spatial Data Warehouse (SDW), at the centre of the slide: provides the source of authoritative data released by owners for use by other internal GOM users. The SDW contains an Enterprise Geodatabase to store and serve published data, and an ArcGIS Server to provide this data through spatial web services. The spatial web services will be used by the internal web-based dataviewer as well as GIS desktop clients. Metadata will be provided in this environment to give additional information about the spatial datasets.
Internal GIS users can access the information using the GIS on their desktops or CITRIX. Non-GIS users can access the data using the internal web-based dataviewer.
The publishing (PUB) environment is similar to the SDW but is located outside the internal PDN so that it is accessible from the Internet.
External GIS users can use the spatial web services from the PUB environment in their GIS; non-GIS users can use the external web-based viewer to consume the spatial web services. External users will not be able to access the enterprise geodatabase directly the way internal GIS users can access the SDW. Internal users can access all aspects of the PUB environment. Metadata will be provided in the PUB environment.
The SDW and PUB ArcGIS Servers will publish imagery web services from source data stored in the Imagery Storage at the bottom of the slide. This maximizes usage by allowing the data to be used by all components of the SDI as well as desktop clients, and minimizes storage requirements because the data only has to be stored once.
Extract, Transform and Load (ETL) processes will be in place to move data from one environment to another.
Operational is considered a working area for data production. Select datasets in the Operational environment will be identified to be hosted in the SDW. Once data is ready to be moved into the SDW, ETL scripts will be developed to load the data into the SDW. The process can be one-time, on a scheduled frequency, or on demand. The data flowing from Operational to the SDW can be moved as is, or its content and/or structure can be manipulated before it is loaded into the SDW. Once it is loaded into the SDW it will be visible read-only to GOM. Spatial web services will be configured to expose the information as map services, feature services, WMS/WFS, and so on.
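The Operational-to-SDW movement described above follows a plain copy-and-filter ETL pattern. As a minimal sketch, the following uses SQLite in place of both geodatabases; the table, column, and status names are illustrative assumptions, not the actual schema:

```python
import sqlite3

# Two in-memory databases stand in for the Operational and SDW geodatabases.
operational = sqlite3.connect(":memory:")
sdw = sqlite3.connect(":memory:")

operational.execute("CREATE TABLE parcels (parcel_id TEXT, status TEXT)")
operational.executemany(
    "INSERT INTO parcels VALUES (?, ?)",
    [("P1", "final"), ("P2", "draft"), ("P3", "final")],
)
sdw.execute("CREATE TABLE parcels (parcel_id TEXT)")

def etl_to_sdw(src, dst):
    """Copy only released rows into the warehouse; content and structure
    can be reshaped here before loading, as the notes describe."""
    rows = src.execute("SELECT parcel_id FROM parcels WHERE status = 'final'")
    dst.executemany("INSERT INTO parcels VALUES (?)", rows)

etl_to_sdw(operational, sdw)
print(sdw.execute("SELECT COUNT(*) FROM parcels").fetchone()[0])  # 2
```

The same function could be wired to a scheduler for the scheduled-frequency case, or invoked directly for one-time and on-demand loads.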
The ETL process from SDW into PUB is very similar.
Winnipeg is subdivided into 342 individual areas each maintained separately with a total of 256,900 polygons
Manitoba outside of Winnipeg currently has 323 individual areas with a total of 290,500 polygons
The Manitoba Reference Grid is the original grid of quarter-section mapping in the surveyed area of Manitoba, combined and then subdivided into 9 areas. These tiles are maintained by correcting gross errors as they are discovered, and areas are removed from them as areas of accurate cadastral mapping are created. The edges of these tiles are adjusted to fit the areas of accurate cadastral mapping that are added to the mapping of Manitoba.
The Northern Quarter Sections are calculated using the few positions of known surveyed points, fitting the design dimensions of the township system to those positions, and extending the design dimensions outward from the known surveyed points.
Winnipeg is now covered by mapping areas which are individually maintained as new survey plans are registered with The Property Registry
These graphics show how tiles are accessed via the MLI (Manitoba Land Initiative) by first selecting an area of the city from the overall map then by selecting a tile from a subarea
Each of the 323 individual areas of Manitoba outside Winnipeg is maintained separately as new survey plans are registered.
We select a subarea of the province from the overall index map. From a subarea we can select individual datasets. The table below the map shows some details of the selected dataset including the version number
This fabric is a rough approximation of the township system, and it is used to provide a map of this system to external users. It is not an ideal mapping product, but it is the only one we have. At various times gross errors are discovered in this grid; they are repaired, and the revised fabric is made available via the MLI.
The fabric was subdivided into 9 tiles to make it more manageable, and it is maintained and made available via these tiles and also as a combined product. This fabric is also fitted to the areas of accurate cadastral mapping described in the previous slide; portions of the reference grid are removed when new cadastral mapping is produced, leaving ‘holes’ in the grid in those areas. Tiles in this fabric have version numbers, and when they are revised the version numbers are incremented.
The graphic on the slide shows the page on the MLI where the fabric is made available to the public.
The product is a computed, topologically structured, theoretical quarter section grid of the unsurveyed north. Coordinates are in the NAD83 UTM projection.
This grid is fitted to the Reference Grid and any accurate cadastral datasets that abut it
The sole exception to the fit of the polygon fabrics is the City of Winnipeg cadastral mapping. It was produced, and is still maintained, in the June 1990 computation of the NAD83 coordinates. Since that time several more accurate sets of coordinates for survey control have been produced, and the coordinate set used to produce the City of Winnipeg mapping is relatively inaccurate compared to the others. This means that the coordinates along the boundary between the City of Winnipeg cadastral mapping and the surrounding accurate Manitoba cadastral mapping are incompatible and may produce overlaps. How this situation will be resolved has not yet been determined.
A single enterprise GeoDatabase polygon feature class provides the ability to combine all of the polygon fabrics and makes them available to all users within and external to Manitoba.
This is the database table that records the current versions of the individual datasets contained in the cadastral polygon feature class. It is used to track the current version of each dataset, and it will also be available to users to query the other attributes of the datasets, such as the latest survey plan numbers mapped in a version and the number of parcels contained in the dataset.
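To make the role of the jobs table concrete, here is a rough model of it, using SQLite as a stand-in for the SQL Server enterprise geodatabase; the table and column names are assumptions based on the attributes mentioned above, not the actual schema:

```python
import sqlite3

# In-memory SQLite stands in for the SQL Server enterprise geodatabase.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cadastral_jobs (
        dataset_name   TEXT PRIMARY KEY,  -- e.g. a Winnipeg tile identifier
        version        INTEGER NOT NULL,  -- incremented on each revision
        latest_plan_no TEXT,              -- newest survey plan mapped in this version
        parcel_count   INTEGER            -- number of parcels in the dataset
    )
""")

# Record that version 3 of a (hypothetical) dataset is in the feature class.
conn.execute(
    "INSERT INTO cadastral_jobs VALUES (?, ?, ?, ?)",
    ("WPG_AREA_6", 3, "PLAN_12345", 1820),
)

# Users can query which version of a dataset is currently loaded.
row = conn.execute(
    "SELECT version, parcel_count FROM cadastral_jobs WHERE dataset_name = ?",
    ("WPG_AREA_6",),
).fetchone()
print(row)  # (3, 1820)
```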
FME is used to populate and update this table using the attribute information produced during the cadastral mapping operations.
The first workbench file checks the dataset version contained in the input data, looks for whether that dataset is already present, and checks whether the input version is newer than the one already present. If it is a new dataset, this file calls the second workbench file to put the input dataset into the feature class; if a newer version is replacing an existing one, it removes the existing polygons for the dataset and then calls the second workbench file to insert the new data.
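In outline, the decision the first workbench file makes can be sketched in plain Python (the names are hypothetical; in the real workspace this logic lives in the Joiner and Tester transformers):

```python
from typing import Optional

def decide_action(input_version: int, current_version: Optional[int]) -> str:
    """Decide how to handle an incoming cadastral dataset.

    current_version is None when the dataset is not yet present
    in the feature class (looked up in the cadastral jobs table).
    """
    if current_version is None:
        return "insert"        # new dataset: load it directly
    if input_version > current_version:
        return "replace"       # delete the old polygons, then insert the new ones
    return "reject"            # same or older version: do nothing

print(decide_action(1, None))  # insert
print(decide_action(5, 4))     # replace
print(decide_action(4, 4))     # reject
```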
The second workbench file fills in fields that are not present in the input and performs checks and repairs on coordinates in the data (more on that later).
This operation has two main areas:
- the decision of which kind of operation is being done (insert of a new dataset or update of an existing one) and the insert of a new dataset – this area is in the purple bookmark area
- the update of an existing dataset – this area is in the green bookmark area
The transformers circled in red are the WorkspaceRunner instances that run the second workspace to actually insert the new cadastral polygons into the feature class
This is a screen capture from running the process to update a City of Winnipeg cadastral dataset. In this case the data flows through the steps to process updates and City of Winnipeg data.
This process takes three input parameters: which area of Manitoba the accurate cadastral data is in (Winnipeg or Manitoba), the ACCESS database in which the jobs tables are stored, and the input SHAPE file.
It has three readers: a SHAPE file reader for the cadastral polygons, plus one reader each for the Manitoba jobs table and the Winnipeg jobs table.
This part of the workbench file starts by sampling one SHAPE file polygon. It then separates the input data into streams for the City of Winnipeg and Manitoba, because the job information for each of these types is contained in a different database table. It then uses the FeatureMerger transformer to select the record for the job from the appropriate database table and attach all of the database field data to the sampled polygon. Next, the Joiner transformer attempts to get a record for this cadastral dataset from the SDI SDE jobs table. If a version of the dataset is already present in the feature class, the Tester transformer sends the processing down to the update area of the process; otherwise it proceeds to the right, to the insertion of a new dataset.
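The FeatureMerger step above is essentially an attribute join keyed on the dataset name. A small illustration outside FME, with hypothetical field names:

```python
# One sampled polygon from the input SHAPE file (hypothetical attributes).
sampled = {"dataset_name": "MB_AREA_17", "version": 2}

# The matching record from the jobs database table.
job_record = {
    "dataset_name": "MB_AREA_17",
    "latest_plan_no": "PLAN_67890",
    "parcel_count": 950,
}

def merge_job_attributes(feature, jobs):
    """Attach job-table fields to the sampled feature, as FeatureMerger would."""
    for job in jobs:
        if job["dataset_name"] == feature["dataset_name"]:
            merged = dict(feature)
            merged.update(job)  # the feature now carries the job's fields too
            return merged
    return feature  # no match: the feature passes through unchanged

merged = merge_job_attributes(sampled, [job_record])
print(merged["parcel_count"])  # 950
```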
The last part of the insertion of a new dataset into the GeoDatabase feature class calls the workbench file that does this task using the WorkspaceRunner transformer. If the run was successful, a success message is issued to the log and the cadastral jobs table is updated with the job information using the writer set up for that process.
The screen capture on the right side shows the configuration of this writer, with the Advanced – Writer Mode value set to INSERT to allow data to be inserted into the table.
This is the area of the workbench file where polygon datasets are input into the feature class as updates. If a previous version (a lower version number) of a dataset is in the feature class then that version is deleted and the new version is inserted.
First, the version number of the job in the cadastral jobs table is tested against the version number contained in the sample polygon taken from the input dataset, using the Tester transformer, to confirm that it is a lower version number. If this test fails, an error message is generated and the process terminates. If a higher version number of the dataset is found, attributes are created to run SQL statements against the polygon feature class to remove the previous version of the job. With these SQL attributes set up, the workflow branches into the SDE writer that uses them to remove the previous version of the job. The next screen capture shows the SDE writer set up with the Advanced – Writer Mode value set to DELETE to perform the deletion.
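The effect of that DELETE pass can be sketched against a stand-in table. SQLite replaces the SDE feature class here, and the table and column names are assumptions; the point is that the generated SQL is keyed on the dataset name and the version being replaced:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cadastral_polygons (dataset_name TEXT, version INTEGER, shape TEXT)"
)
conn.executemany(
    "INSERT INTO cadastral_polygons VALUES (?, ?, ?)",
    [("MB_AREA_17", 1, "poly_a"), ("MB_AREA_17", 1, "poly_b"), ("MB_AREA_9", 4, "poly_c")],
)

def delete_previous_version(conn, dataset_name, old_version):
    """Remove every polygon belonging to the superseded version of a dataset."""
    conn.execute(
        "DELETE FROM cadastral_polygons WHERE dataset_name = ? AND version = ?",
        (dataset_name, old_version),
    )

# Version 1 of MB_AREA_17 is being replaced, so its two polygons are removed.
delete_previous_version(conn, "MB_AREA_17", 1)
remaining = conn.execute("SELECT COUNT(*) FROM cadastral_polygons").fetchone()[0]
print(remaining)  # 1
```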
The other branch proceeds with the insertion of the new dataset into the GeoDatabase feature class by calling the workbench file that does this task using the WorkspaceRunner transformer. If the run was successful, a success message is issued to the log and the workflow proceeds to make attributes for SQL statements to update the cadastral jobs table. The next graphic shows the AttributeCreator parameters that make these attributes. The workflow then proceeds to the SDE writer to write the new attribute values to the cadastral jobs table. This writer is set up in the same way as shown in the previous slide.
This was developed as a separate workbench file so that tasks common to the insertion of cadastral polygons are written in only one place. Many fields are added to the feature class to provide a textual explanation of attributes that contain only a coded value. Here is an example of one of the many calls to the AttributeValueMapper transformer that creates and populates an attribute based on the code contained in another attribute.
During the testing stage, FME was reporting topological problems with self-intersecting polygons when I was importing one dataset that I knew had no problems. When I contacted Safe Software support they said that “the problem is most likely due to the coordinate rounding that occurs when features are written to SDE. The coordinate shift due to the rounding can cause small self-intersections in the data.” They suggested using the ArcSDEGridSnapper transformer to simulate the coordinate conversion and the GeometryValidator transformer to fix the coordinate errors. Inserting these transformers, and setting their parameters to the values matching the SDE feature class, allowed the geometric errors to be fixed and the polygons to be inserted into the GeoDatabase feature class.
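The rounding behaviour behind that problem is easy to demonstrate: SDE stores coordinates on a fixed-resolution grid, so simulating the snap up front (as ArcSDEGridSnapper does) lets validation run on the coordinates the database will actually store. A minimal sketch, with an illustrative 1 mm resolution rather than the actual feature-class setting:

```python
def snap_to_grid(coords, resolution=0.001):
    """Round each (x, y) pair to the storage grid, as SDE will on write.

    Nearly coincident vertices can collapse onto the same grid point,
    which is how tiny self-intersections appear after loading.
    """
    return [
        (round(x / resolution) * resolution, round(y / resolution) * resolution)
        for x, y in coords
    ]

# Two vertices 0.0003 m apart collapse to the same point at 1 mm resolution.
ring = [(633100.1231, 5525300.9), (633100.1234, 5525300.9)]
snapped = snap_to_grid(ring)
print(snapped[0] == snapped[1])  # True
```

Running a geometry check on the snapped coordinates, rather than the originals, is what allowed the errors to be caught and repaired before the SDE write.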
The workbench files to import and update the accurate cadastral mapping and the Manitoba Reference Grid tiles have been tested and they are ready for operation. All nine of the Reference Grid tiles have been imported, four datasets within Winnipeg have been imported and updated and five datasets of the accurate cadastral mapping have been imported and updated. This graphic shows the polygon feature class displayed along with the provincial boundary dataset.