Wednesday 14 December 2016

Reset AutoCAD profile - saved settings can't be unzipped?

I just had to reset an AutoCAD Map 2017 user profile. Before Map resets the profile you can save your existing settings. The settings are saved as a ZIP file.
A bit disappointing though that I couldn't unzip the archive afterwards - Windows 7's built-in "unzipper" informed me that the archive was damaged and therefore couldn't be extracted. But don't give up - just use an alternative app such as 7-Zip. It unzipped the file without any complaints.

Map 2017, SP1, HF3

Map 2017 - DWG export - crash with long symbol names

Map 2017 seems to crash if symbol names are too long and you export too many features of that class (export option: editable). But it might be more complicated than that, as I couldn't reproduce this with all data sets.

After we updated to Map 2017 we got crashes when exporting certain areas using certain Display Models. Further testing revealed that the crashes were linked to a specific point layer (part of displaying electrical cross sections). Then we found out that we could export cross sections - but not all of them at the same time. Let's say we had three cross sections - they could all be exported one by one, but not all together. The layer involved comprises more than 20 thematic rules and we use lots of dynamic expressions (for instance, the symbol size is calculated for each feature based on an attribute: <ParameterValue>1.09375*SCALE</ParameterValue>). It took quite some time to figure out that neither the symbol itself nor any of the settings for the symbol caused the crash.

At one point a new layer was created from scratch and one of the symbols from the original layer was imported from the drawing into the Style Editor again. No other settings were modified - and the Map DWG export crashed. So the conclusion was that it had to be related to the symbol - or, to be more precise, to the graphical representation of the symbol. At least that was what I thought. But why should this symbol crash Map? It wasn't a complex one; it was like many others used in the display model which did not cause any crash. It was very confusing - how can you find a workaround if your explanation for the issue doesn't make any sense? I thought I would need to test every symbol setting step by step to find exactly the point where it causes Map to crash. So I created a new layer from scratch - that is, the default layer Map creates when the feature class is added to the Display Manager. I first added the symbol name, did a test export and - voilà - got the crash.

So, here is a symbol name which will cause Map to crash if I export more than a few features: "BWEWR2501_B35". And yes, these long symbol names mainly occur in the layer for cross section symbols.

As a workaround we shortened the symbol names in the layer file (which is easy to do), but we also had to modify the block names in our export template (which is more time consuming). With the workaround in place we still get crashes when exporting if XDATA is included in the export. I think it took us more than 4 days of testing and eventually applying the workaround - and we are not sure it will cover all eventualities... This issue still occurs with Map SP1 and HF3.


Map 2017, SP1, HF3

Monday 12 December 2016

AutoCAD Map 2017 - features with arcs

Contrary to what I wrote just minutes ago - there is no workaround for the issue regarding modifying a feature with arcs. Even worse - at first I thought the issue only occurs if you convert a polyline with arcs into a Map (FDO) feature. But the same issue occurs if a feature is digitized in Map, and even for an existing feature with arcs.

Description: if you move a vertex, Map moves the position of a different vertex instead. This happens if the feature has arcs. It does not happen with all vertices - I haven't figured out which ones are affected and which are not. Here is an example:


Map 2017, SP1, HF3

AutoCAD Map 2017 Update

It was time to update to a new release, and at the end of November - after testing - we rolled out Map 2017. Updating the Industry Models in our test environment went well. Functional testing of Map and AIMS did not reveal any major issues. But after we updated our production environment we started to see issues coming in:

AIMS / MapGuide

- GeoREST is very slow - generating an image takes 30 seconds
   Insufficient testing: we use GeoREST to embed maps in Crystal Reports. Our reports are already quite slow. I only tested a simple report, containing just one map, to see whether it shows up including the map. I did not check any other reports containing more maps - otherwise I would have noticed that reports are not generated anymore / map placeholders are empty.
   
- error messages we haven't seen before - causing instability and sometimes crashes
(1) Error occurred in Feature Source (Library://FS_INFO/WT_IM_I/Data/TopobaseDefault.FeatureSource): Ausnahmefehler in FDO-Komponente aufgetreten. (An exception occurred in the FDO component.)
ORA-12537: TNS: Verbindung beendet (connection closed)
(2) Error: Keine weiteren Verbindungen zum FDO-Provider von Autodesk.Topobase konnten erstellt werden. (No further connections to the Autodesk.Topobase FDO provider could be created.)

This was a bit more difficult to catch during testing: I didn't do any stress tests, and the issues we see only occur under load. I changed a few settings on the AIMS server and we also changed settings in Oracle - it appears that the error messages occur less frequently now. I will wait a few weeks and then check how often AIMS automatically reboots after a crash, and compare that to the numbers for AIMS 2013.

Map 

- DWG export caused crashes, and even after installing HF3 we have crashes in certain situations
  Export as DWG is very important to us - so we put some time into testing the DWG export. But it seems we did not test enough. Although the crashes only occur in certain situations, these situations could have been identified with proper testing. Fortunately Autodesk published Hotfix 3 shortly after we had rolled out Map 2017 - two issues we ran into have been solved now. For a third one - a crash - we had to find our own workaround (in short: if the symbol name in the layer file is too long, Map can crash if you export a certain number of features).

- converting a polyline with arcs to an IM feature causes unexpected behavior when the feature is modified (moving vertices)
  Bad luck - you cannot test everything, and there is a workaround.

So overall - proper testing costs time, but it will pay off.

Map 2017, SP1, HF3

Wednesday 23 November 2016

MapGuide / AIMS - GETDYNAMICMAPOVERLAYIMAGE and GETMAPIMAGE

We want to use a MapGuide map in an OpenLayers based web mapping application. Whilst checking the two options for calling MapGuide I noticed the following differences between GETMAPIMAGE and GETDYNAMICMAPOVERLAYIMAGE (example requests below):

GETDYNAMICMAPOVERLAYIMAGE
- creates a transparent background
- requires a session id

GETMAPIMAGE
- doesn't create a transparent background, but the map's background color can be set to transparent: <BackgroundColor>00ffffff</BackgroundColor>
- allows username/password or session id
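
For illustration, here is roughly what the two requests against the mapagent look like. This is only a sketch - server name, map resources and the session id are placeholders, and the exact parameter set should be verified against the mapagent test pages of your MapGuide/AIMS version. The overlay request renders a runtime map that already exists in the given session, hence the session id:

 http://myserver/mapguide/mapagent/mapagent.fcgi?OPERATION=GETDYNAMICMAPOVERLAYIMAGE&VERSION=1.0.0&SESSION=<session-id>&MAPNAME=MyMap&FORMAT=PNG

 http://myserver/mapguide/mapagent/mapagent.fcgi?OPERATION=GETMAPIMAGE&VERSION=1.0.0&USERNAME=Anonymous&PASSWORD=&MAPDEFINITION=Library://MyMaps/MyMap.MapDefinition&FORMAT=PNG&SETDISPLAYWIDTH=800&SETDISPLAYHEIGHT=600&SETDISPLAYDPI=96&SETVIEWCENTERX=2600000&SETVIEWCENTERY=1200000&SETVIEWSCALE=5000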

We tried WMS as well, but MapGuide (AIMS 2013) as a WMS server is not great - mainly, performance is not sufficient. Will do another performance test with AIMS 2017 soon.

Monday 19 September 2016

AIMS 2017 - new error message

After updating from AIMS 2013 to AIMS 2017 no raster images were shown. Any preview in Studio triggered the following error message in the log file:


<2016-09-07T08:33:35> 3752 Ajax Viewer 10.99.66.114 Administrator
Error: Stilisierung von folgendem Layer fehlgeschlagen: Luftbild_1pro
Ausnahmefehler in FDO-Komponente aufgetreten.
Ausnahmefehler in FDO-Komponente aufgetreten.
'FdoATILSession::GetImage': Funktion kann aufgrund eines ungültigen Wertes für Eingabeparameter 'fileDescriptor' nicht ausgeführt werden. (Ursache: , grundlegende Ursache: 'FdoATILSession::GetImage': Funktion kann aufgrund eines ungültigen Wertes für Eingabeparameter 'fileDescriptor' nicht ausgeführt werden. )
StackTrace:
- MgMappingUtil.StylizeLayers() Zeile 899 Datei e:\build\2017\re_ims2017_46_1-6376417\ent\os\server\src\services\mapping\MappingUtil.cpp

(Roughly translated: stylization of layer Luftbild_1pro failed - an exception occurred in the FDO component - 'FdoATILSession::GetImage': the function cannot be executed due to an invalid value for input parameter 'fileDescriptor'.)

This message was new to me and hadn't occurred with AIMS 2013.

Also, "Testing connection" always returned "OK" for our raster file data source. The raster file data source uses an ALIAS which points to a shared network drive.

The problem was that AIMS 2017 was running under a local user account. In order to access the network drive, the AIMS service needs to run under a different user account.

AIMS 2017, SP1

Monday 12 September 2016

Oracle ANSI JOIN syntax improves performance

Using ANSI JOIN syntax instead of the old Oracle join syntax is recommended - see here:

http://www.orafaq.com/node/2618

We found out the hard way:

- a third-party application issues simple but nested SQL statements against Map IM views
- for some views we noticed that the query ran for a long time and generated a timeout
- just by changing the JOIN syntax to an ANSI join the issue was resolved (see the sketch below)
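
To illustrate the kind of rewrite - a minimal sketch only; all table and column names below are made up for the example:

 -- Oracle-specific outer join syntax using (+)
 SELECT l.fid, a.description
 FROM ww_line l, ww_attr a
 WHERE l.fid = a.line_fid(+);

 -- the same query using ANSI JOIN syntax
 SELECT l.fid, a.description
 FROM ww_line l
 LEFT OUTER JOIN ww_attr a ON l.fid = a.line_fid;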

Map 2017

Friday 9 September 2016

AIMS GeoRest - Update 2013 to 2017

I installed AIMS 2017 on our test server. To get the GeoREST extension running again the following steps were required:

(1) change path for virtual folder "rest" to ..Autodesk\Autodesk Infrastructure Web Server Extension 2017\www\GeoREST\bin
(2) re-create Wildcard Module Mapping for GeoRest_ISAPI.dll
(3) copy all GeoRest configuration files from old installation into ..Autodesk Infrastructure Web Server Extension 2017\www\GeoREST\conf
(4) change Port number in conf files (old port number: 2801, new port number: 2811)

Restart IIS, test, done.

AIMS 2017

Tuesday 23 August 2016

Python - update georeferencing information in multiple files

You need to install GDAL Python bindings first.

https://pythongisandstuff.wordpress.com/2011/07/07/installing-gdal-and-ogr-for-python-on-windows/

Here is a basic script to use as a template:

 import os
 import fnmatch
 from datetime import date
 from osgeo import gdal
 from osgeo.gdalconst import GA_Update

 # log file path
 logfile = "c:/temp/geotifheader_update.log"
 # path to the GeoTIFFs
 tif_filefolder = "S:/MG_DATEN/Rasterdaten"
 # shift for all images - we just add a fixed shift to every image
 shift_x = 2000000.950
 shift_y = 999999.800

 # collect all TIF files in the folder, including subfolders
 def list_files(dir):
     r = []
     for root, dirs, files in os.walk(dir):
         for name in fnmatch.filter(files, "*.tif"):
             r.append(os.path.join(root, name))
     return r

 # write a message to the log file
 def writelog(message):
     with open(logfile, 'a') as myfile:
         myfile.write(message)

 # init the log file with today's date
 def initlog():
     with open(logfile, 'a') as myfile:
         myfile.write(str(date.today()) + "\n")

 # update the georeferencing information of one image
 def setHeader(filepath):
     # open the file in update mode
     dataset = gdal.OpenEx(filepath, GA_Update)
     geotransform = dataset.GetGeoTransform()
     if geotransform is not None:
         print "**file**: ", filepath
         print 'Origin = (', geotransform[0], ',', geotransform[3], ')'
         print 'Pixel Size = (', geotransform[1], ',', geotransform[5], ')'
         ulx = geotransform[0]
         uly = geotransform[3]
         # simple check whether the image should be updated or not
         if (ulx > 1000000 or ulx == 0 or uly > 1000000 or uly == 0):
             m = "current insertion point out of bounds: " + str(ulx) + "," + str(uly) + "\n"
             print m
             writelog(m)
         else:
             # shift the upper left corner and rebuild the geotransform
             ulx = ulx + shift_x
             uly = uly + shift_y
             lrx = ulx + (dataset.RasterXSize * geotransform[1])
             lry = uly + (dataset.RasterYSize * geotransform[5])
             gt = [ulx, (lrx - ulx) / dataset.RasterXSize, 0, uly, 0, (lry - uly) / dataset.RasterYSize]
             dataset.SetGeoTransform(gt)
             m = "image has been updated.\n"
             print m
             writelog(m)
     else:
         m = "not a geotiff.\n"
         print m
         writelog(m)
     dataset = None  # dereference the dataset so that changes are flushed to disk

 files = list_files(tif_filefolder)
 initlog()
 for file in files:
     writelog("-----------------------\n")
     writelog(file + "\n")
     setHeader(file)
     writelog("-----------------------\n")

 '''
 for a single file:
 setHeader("S:/GIS/DTMAV/Relief/relief_dtmav30.tif")
 '''
 # https://gis.uster.ch/dokumentation/datenkonvertierung/gdal-tools
 # set path=C:\Program Files\GDAL;C:\Python27

Thursday 18 August 2016

MapGuide/AIMS - modifying maps and layers as XML

Often I export MapGuide map definitions or layers as XML in order to add or remove content. Most of the time this is faster than using Studio. The modified XML file is uploaded with the MapAgent and "SetResource". Our map definitions in particular contain hundreds of layers, and layers often need to be added or removed. I just realized that instead of deleting layers and uploading the map definition I can simply put the layers in question within XML comment tags and upload the file again. If I have made a mistake and "deleted" a layer which I didn't want to delete, I can go back to my XML file, remove the comment tags and upload the file again. Unfortunately the comments are filtered out during upload - so when you save the map definition from Studio as an XML file again, all comments are gone.
By the way - when you put comments into Map layer definition files and open and save the Display Model with Map, the comments are retained.

Here is an excerpt from a map definition in which one of the layers (SW_Projektperimeter) has been put inside comment tags:

 </MapLayer>  
 -->  
  <MapLayer>  
  <Name>GR_Baum</Name>  
  <ResourceId>Library://FS_INFO/WT_GR_I/Layers/GR_Baumkataster/GR_Baum.LayerDefinition</ResourceId>  
  <Selectable>false</Selectable>  
  <ShowInLegend>true</ShowInLegend>  
  <LegendLabel>GR_Baum</LegendLabel>  
  <ExpandInLegend>false</ExpandInLegend>  
  <Visible>true</Visible>  
  <Group>GR_Baumkataster</Group>  
  </MapLayer>  
  <!--  
  <MapLayer>  
  <Name>SW_Projektperimeter</Name>  
  <ResourceId>Library://WMS_IN/IS/Layers/SW_Projektperimeter.LayerDefinition</ResourceId>  
  <Selectable>false</Selectable>  
  <ShowInLegend>true</ShowInLegend>  
  <LegendLabel>SW_Projektperimeter</LegendLabel>  
  <ExpandInLegend>false</ExpandInLegend>  
  <Visible>false</Visible>  
  <Group>Leitungskataster</Group>  
  </MapLayer>  
  -->  
  <MapLayer>  

Saturday 13 August 2016

Lost in translation - even worse....

Just created a network deployment for Map 2017 and noticed that the ReCap description hasn't changed much (see screenshot). If you do not speak German, you really miss out on a ridiculous translation.

ReCap - "Realitätserfassung in einer Vorbereitungsumgebung" (roughly: "reality capture in a preparation environment") - good luck with that!


Wednesday 10 August 2016

Spatial Queries and Map

Once in a while you have CAD or GIS data and you need to link features, but there are no attributes to do that. Map's spatial functions such as MapOverlay cannot always be used to establish a spatial relationship. Here are two examples:

#1
"I have a field of points ... What I would like is to be able to analyse the neighbours within the footprint (1 meter) of each recording point to determine the range elevation differences. So for instance, the circled point in the snip, has two neighbours and the deviations in elevation from the point being analysed are +0.007 and -0.010m. I would like to somehow record these values. Some points won't have any close neighbours so they would have no records against them."

#2
There are 8000 weld joints along a pipe. Weld joints are marked by means of a block including some attributes. Next to the pipe are 800 drill points. They are also marked by means of a block with certain attributes. The task was to find the closest drill point for each weld joint. The distance between weld joints and drill points is not uniform.

I don't know whether you could solve these tasks with out-of-the-box functionality in QGIS or ArcGIS. But as both support Python, it is probably not so difficult to find the answers by writing a few lines of code. Unfortunately Map doesn't offer any useful scripting - VBA and Lisp only support a subset of the Map API.

If you need to solve tasks like the ones above and you do not have anything else at hand than Map, you could try your luck with SQLite.

SQLite itself is a file-based database. There are freely available tools to connect to SQLite files and to perform SQL queries. There is also an extension for spatial data, which means you can perform spatial queries. Map can import and export SQLite Spatial, which allows you to run spatial queries against Map data without having to install a full-blown (spatial) database system such as Oracle Spatial, SQL Server or PostGIS.

When working with spatial data in SQLite you can use Spatialite_GUI:

Download:
http://www.gaia-gis.it/gaia-sins/
http://www.gaia-gis.it/gaia-sins/windows-bin-x86/spatialite_gui-4.3.0a-win-x86.7z

Unzip file using 7-zip and execute the program "spatialite_gui.exe" - no installation required. 

I'm going to describe how to export to SQLite and how to perform a simple spatial query using the data from example 1 above. The drawing contains Civil objects which I exported to SDF using the Civil command EXPORTTOSDF. I then imported the SDF into Map (command: MAPIMPORT) including the attribute data. As a starting point I got a drawing containing plain AutoCAD points with attribute data attached.

1. open drawing file

2. check the AutoCAD points - they have Map object data attached; each point has a point number and an elevation value:

AutoCAD points with some attribute data attached (Map object data table)

3. export the points to a SQLite Spatial file, command: MAPEXPORT

Make sure that you export your points as a Point feature class (not as Point+Line+Polygon) and also export the attached attributes. You can also set a name for the table the features will be exported to:

MAPEXPORT - export options

   
4. run Spatialite_GUI

5. connect to SQLite file

You should see the following message:
FDO-OGR detected;activating FDO-OGR auto-wrapping...
- Virtual table: tablename
...


Spatialite_Gui message about Map/FDO geometries


Spatialite_GUI supports "native" SQLite spatial geometries as well as Map (FDO) geometries. The message tells you that Map/FDO geometries were found in the database and that Spatialite_GUI has created a wrapper around those geometries by means of a "virtual table". With that mechanism in place, Map/FDO geometries can be used as if they were "native" SpatiaLite geometries.

6. the application shows all tables found in the SQLite file; you will not only see the table containing the points you just exported but also some further tables created by Map containing Map-specific metadata. Below "User data" you will see at least two tables. Table number one will have the same name as given in the MAPEXPORT dialog box, the second table will have the same name but with the prefix "fdo". The latter is also marked with a "chain" symbol - meaning it is a "virtual table". Both tables have basically the same content - but only the second table can be used for spatial queries:


Spatialite_GUI
Perform a quick test and type the following SQL statement into the input area and then execute the query:

select * from mypoints

the result will look like this:

1 BLOB sz=32 UNKNOWN type -2.638000 5522
2 BLOB sz=32 UNKNOWN type -2.887000 5523
3 BLOB sz=32 UNKNOWN type -2.930000 5524


Query result
The GEOM column contains the geometry of the feature - but the content of the geometry column is just shown as "BLOB" and marked as "unknown": the geometry in the original table is stored in a way SQL cannot access.

If you do the same against the second table the result will look slightly different:

select * from fdo_mypoints

1 BLOB sz=60 GEOMETRY -2.638000 5522
2 BLOB sz=60 GEOMETRY -2.887000 5523
3 BLOB sz=60 GEOMETRY -2.930000 5524

It is still only shown as "BLOB", but this time SQL recognises that those BLOBs store some kind of geometry.

To access the geometry and to apply spatial queries you have to use the second table.
Here is another example - show the X and Y coordinates of the points:

select number, ST_X(geom) as X, ST_Y(geom) as Y from fdo_mypoints

5522 15950702.370320 2537019.106847
5523 15950700.846391 2537017.708832
5524 15950699.357462 2537016.129788

If you execute the SQL statement against the other table you won't get any useful result:

select number, ST_X(geom) as X, ST_Y(geom) as Y from mypoints

5522 NULL NULL
5523 NULL NULL
5524 NULL NULL

7. Before we can perform the spatial query we need to copy our (virtual) table. I don't know why, but when I run the SQL statement (as shown in the next step) against my fdo_mypoints table I get a wrong result (too few rows are returned - it seems the cross join doesn't work on the virtual table). To copy the table, the following SQL statement needs to be executed:

CREATE TABLE new_table_name AS SELECT * FROM existing_table_name

CREATE TABLE fdo_mypoints_copied AS SELECT * FROM fdo_mypoints

Afterwards you need to refresh the tree view (context menu for root node >> "Refresh") in order to see the newly created table.

8. To link points within a distance of 1 m we can use the following statement:

select 
p1.number as p1_num, 
p2.number as p2_num,
p1.elevation as p1_ele,
p2.elevation as p2_ele,
ST_Distance(p1.geom, p2.geom) as dist_poi,
p1.elevation-p2.elevation as diff_ele,
p1.geom as p1_geom
from fdo_mypoints_copied as p1 cross join fdo_mypoints_copied as p2
where 
p1.number <> p2.number
and
dist_poi < 1
order by p1_num

As all points are in one table we need a cross join to compare each point against all other points in the same table. We then filter out all pairs where both points are the same (p1.number <> p2.number) and where the distance between the points is >= 1.
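
Example #2 (finding the closest drill point for each weld joint) can be solved with the same ingredients. Here is a rough sketch, assuming the weld joint blocks and the drill point blocks were exported to two separate tables and copied as in step 7 - the table and column names below are made up:

 select
 w.number as weld_num,
 (select d.number
  from fdo_drillpoints_copied as d
  order by ST_Distance(w.geom, d.geom)
  limit 1) as nearest_drill,
 (select min(ST_Distance(w.geom, d.geom))
  from fdo_drillpoints_copied as d) as dist
 from fdo_weldjoints_copied as w
 order by weld_num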

Now we can check the result - if the query returns roughly the number of rows we would expect, then we are ready to export it. Just right-click anywhere in the table view and choose "Export ResultSet --> as Shapefile".

Keep in mind: column names in a SHP file are limited to 10 characters, so your SELECT statement should set the column names accordingly.

9. Add the SHP file to Map
When you open the table view for the SHP file in Map and click on one of the (numeric) columns, you might get an error message (data type 'FdoDataType_Int64' not supported) and the table view goes blank. Just right-click on the layer in the Display Manager and choose "Export layer data as SDF", then remove the SHP layer from the drawing and add the SDF file instead.


It's not that complicated, is it? Some knowledge of SQL and spatial queries is required though. There are also some peculiarities with SQLite / SQLite Spatial, but as soon as you are aware of them the whole process can be done in a few minutes.



Saturday 30 July 2016

Why you should restart AIMS/MapGuide when testing ....

I noticed by chance that a few layers - which are all based on one Oracle view - were significantly slower in our production system, while no performance issue occurred in our test environment. Whilst investigating the issue I made a few mistakes which made the whole process of finding and fixing it more time consuming than necessary. Here are the details:

After noticing the issue I asked my colleagues whether they had any idea why this was and whether anything had changed recently. None of them had an explanation, so I spent some time checking / testing:

- running performance reports in MapGuide/AIMS and noticing that the time spent on "Layer" is 20 times higher in production
- comparing view definitions, but only cursorily (looking at the WHERE clause only) : mistake #1
- re-creating the spatial index for the base table : no change
- running 1-click maintenance in Map-Administrator : no change
- copying a layer definition from production to the test environment : no change
- trying to load the view into other applications to see whether the issue is related to the application or to the view: GeoRaptor couldn't add the layer to a map due to a rowid issue, QGIS rejected the view complaining that it didn't have a proper spatial index, and TatukGIS Viewer said basically the same (at first the message about the spatial index made me re-create the spatial index yet again, but testing further views in QGIS/TatukGIS Viewer I got the same messages for all other views as well)
- running a few spatial SQL queries in SQL Developer, but there wasn't any significant difference between test and production : mistake #2 (this result caused quite some confusion as I did not have an explanation for it - it did not fit and even contradicted other findings!)

After I had spent some time on these tests, a colleague came back to me and told me that the view had changed recently, but that when he applied the change he didn't notice any performance issue whilst generating graphics in Map. The change consisted of adding a subquery to the SELECT clause. He asked if the change should be removed, but I replied that I would do some tests first, taking the new information into account : mistake #3.
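
Just to illustrate the kind of change - a hypothetical sketch; the view and table names are invented, only the added column name OWNER_OBJTYP is taken from the error message shown further down:

 -- original view (simplified)
 CREATE OR REPLACE VIEW dp_xxx_view AS
 SELECT g.fid, g.geom
 FROM dp_xxx g;

 -- changed view: a correlated subquery adds one more column,
 -- which Oracle evaluates for every row the application fetches
 CREATE OR REPLACE VIEW dp_xxx_view AS
 SELECT g.fid, g.geom,
        (SELECT o.objtype FROM owner_tab o WHERE o.obj_id = g.obj_id) AS owner_objtyp
 FROM dp_xxx g;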

As the change to the view definition had only been applied in the production system, I applied it in the test system as well - and here mistake #4 occurred. I then ran the MapGuide/AIMS performance report again, but performance had not dropped - it was as good as before. Then I changed the data source of the Oracle user I was testing with and pointed it to our production database. When I ran the test again, performance was slow. I stopped any further testing for the day and did something else.

At this point my theory was that the same query had different execution plans in Oracle (production and test) and therefore different execution times, due to different Oracle settings. What didn't really fit was that when I ran the spatial queries in SQL Developer I didn't see any significant difference in performance. But I didn't have anything else to test, so finally I decided to replace the view definition in production with the previous one - basically removing the change which had been applied recently. I re-ran the performance report in MapGuide/AIMS and suddenly got an error message:

Failed to stylize layer: DP_xxx
An exception occurred in FDO component.
ORA-00904: "G"."OWNER_OBJTYP": invalid identifier
...

Now I realised one of my mistakes. MapGuide/AIMS caches certain settings/definitions. I don't know how it works in detail, but probably roughly like this:
- when AIMS/MapGuide starts, it reads the Oracle schema and caches some information
- (SELECT) statements are built based on the cached information and used against Oracle

For a view, MapGuide/AIMS seems to read the view definition and then create a SELECT statement which includes all columns of the view.
If the view definition in Oracle changes and new columns are added to the view, MapGuide can still run its SELECT statement against the new view definition - it simply ignores the newly added columns. But when columns have been removed from the view, the SELECT statement still includes those columns and Oracle will respond with ORA-00904.

Mistake 4 - when I applied the change to the view definition in the test system I did not restart MapGuide/AIMS. MapGuide was still using the SELECT statement which did not include the newly added column of the view. But exactly that new column had a negative impact on performance.
If I had restarted MapGuide/AIMS at this point I could have avoided further testing and head-scratching.

Mistake 3 - if I had listened to my colleague, who offered to remove the change in production, I might have noticed the cause of the problem earlier (because even without restarting MapGuide/AIMS we would have seen the ORA-00904 message in the log file).

Mistake 2 - my spatial query was simplified; the SELECT clause included only "fid, geom". As it did not include the column which caused the performance issue, it is really no surprise that performance in production and test was nearly the same. Conclusion: when testing, always use (if possible) exactly the same (SELECT) statement as the application!

Mistake 1 - I should have noticed the difference in the view definitions myself, just by copying the view definitions from production and test into an editor and noticing the different number of lines...

So in the end a trivial issue, but it took longer than necessary to get it fixed.

Wednesday 27 July 2016

AutoCAD Map 2017, job enabled IMs and Oracle 12.1.0.2.0

Map 2017 ReadMe states:

  • Oracle 12cR1 (12.1.0.2.0) is not recommended for Industry Models with jobs. You should use other versions of Oracle like 11.2.0.4.0 or 12.1.0.1.0 to work with jobs.


This issue with jobs and 12.1.0.2.0 has now been fixed by Autodesk. Autodesk has published a TS (technical solution) regarding the fix.


Friday 15 July 2016

Publish to MapGuide/AIMS - Spatial Filter II

As described in a previous posting, when you publish data from Map to MapGuide/AIMS the published layer files get a spatial filter assigned. The filter will decrease performance and needs to be removed. So usually we remove the spatial filters after publishing from Map.

But today we had a user reporting to us that some layers don't appear on the map anymore, although they used to appear in the past.

As it turned out, spatial filters had been assigned to those layers, but the filters covered our area of interest and therefore no one noticed them. A few weeks ago, however, we switched to a new coordinate system and had to transform our spatial data into the new system. The spatial filters defined in some layers had not been transformed, as we had not been aware of them and assumed they had all been removed. After removing the spatial filters the layers re-appeared on our maps.

The issue of layers getting a spatial filter assigned when being published was reported to Autodesk some time back. Maybe it has been fixed by now in newer releases.

Here is the core bit of the PHP based script for removing spatial filters - as usual a quick and dirty solution:



 if ($_POST['hiddenparam'] == '') die();       
      if ($_POST['projektpfad'] == '') die();       
      $MGAdminPassword = $_POST['password'];  
      try  
      {                      
           include './config.schema.list.php';  
           include '../vaw.lib/initializeMapGuide.php';  
           $Verzeichnis = $_POST['projektpfad'];             
           echo "<html><body>";  
           echo "<p>MapGuide Pfad: $Verzeichnis </p>";  
           $VerzeichnisResourceId = new MgResourceIdentifier($Verzeichnis);  
           $LayerReader = $resourceService->EnumerateResources($VerzeichnisResourceId, -1, "");  
           $Xml = $LayerReader->ToString();  
           $LDomDoc = DOMDocument::loadXML($Xml);  
           $pfad = new DOMXPath($LDomDoc);  
           $frage = "//ResourceDocument/ResourceId";  
           $ebenen = $pfad->query($frage);  
           $anzlayer = 0;  
      foreach($ebenen as $knoten)  
      {  
           $Layer = $knoten->nodeValue;  
           $LayerResourceId = new MgResourceIdentifier($Layer);  
           $layerName = $LayerResourceId->GetName();  
           $layerType = $LayerResourceId->GetResourceType();  
           if($layerType == 'LayerDefinition')  
           {  
                echo "<b>Layername: $layerName </b><br>";  
                $tmpReader = $resourceService->GetResourceContent($LayerResourceId);  
                $layerXML = $tmpReader->ToString();                 
                $doc = DOMDocument::loadXML($layerXML);                            
                $FilterNode = $doc->getElementsByTagName('Filter');  
                foreach($FilterNode as $node)   
                {  
                     if(!stristr($node->nodeValue, "GeomFromText"))
                     {
                          // no spatial filter found
                          //echo " <b>No spatial filter.</b><br>";
                     }
                     else
                     {
                          // report the existing filter before clearing it
                          echo "<b>Spatial filter - found and removed.</b> <br><br>";
                          echo "<small>existing filter: {$node->nodeValue} .</small>";
                          $node->nodeValue = '';
                     }
                }                           
                //echo $doc->ToString();  
                $modifiedLayerDefinition = $doc->saveXML();  
                $byteSource = new MgByteSource($modifiedLayerDefinition, strlen($modifiedLayerDefinition));  
                $byteSource->SetMimeType(MgMimeType::Xml);  
                //$resourceId = new MgResourceIdentifier($LayerResourceId);  
                $resourceService->SetResource($LayerResourceId, $byteSource->GetReader(), null);  
           }  
      }  
 }  
 catch ( MgException $e )  
   {  
     $errorMsg = $errAuthenticationFailed;  
   }  
   catch ( Exception $e )  
   {  
     $errorMsg = $e->getMessage();  
   }  

Map 2013, SP2

Thursday 7 July 2016

Map API - MgFeatureReader - improve speed

The developer example for reading features with MgFeatureReader looks like that:


 while (featureReader.ReadNext())
 {
     featureCount++;
     propertyCount = featureReader.GetPropertyCount();
     for (int i = 0; i < propertyCount; i++)
     {
         propertyName = featureReader.GetPropertyName(i);
         propertyType = featureReader.GetPropertyType(propertyName);
         PrintPropertyValueFromReader(featureReader, propertyType, propertyName, ref sb);
     }
 }

(from the MapGuide API Reference).

At least with Map this will be slow. Performance can be improved significantly if the GetPropertyName and GetPropertyType calls are removed from within the for loop.

Example: a small application that displays feature attributes in a DataGrid.

DataGrid

The sample SDF file contains 110'000 features and has 11 columns (text and numbers).

Using code similar to the one given above, it takes 19 seconds to populate and display the DataGrid.

By removing the two calls the process takes less than 1 second and is even slightly faster than opening Map's data table (MAPDATATABLE):


Befehl: MAPDATATABLE2
 Finished in 0 seconds.
Befehl: MAPDATATABLE3

 Finished in 19 seconds.


All it takes is to get the property name and property type of each column only once and store this information in an array (colProps) before iterating over the FeatureReader:

 ...  
  while (featureReader.ReadNext())  
       {  
         counter++;  
         newRow = dt.NewRow();          
         for (int i = 0; i < colProps.Length; i++)  
         {  
           propertyName = colProps[i].PropertyName;  
           propertyType = colProps[i].PropertyType;  
                          ....  


Map 2013, SP2


Revisited: Jobs in AutoCAD Map - job-disable a feature class

Just to let you know - the "solution" for having job-disabled feature classes in a job-enabled document, as described here, works well for us. We have been using it in production for nearly a year and haven't had any issues.

Map 2013, SP2

Map IM - generating graphics and DXF files

Recently we encountered two issues regarding DXF files and generating graphics. Both only happen if "Reuse Active Drawing" is enabled in the "Generate Graphics" application options.

1) Display Model is loaded into the wrong drawing

If you have opened one or more drawings (DWG) and one DXF file and you generate graphics over the DXF file, the display model layers won't be loaded into the DXF file (as expected) but into one of the other currently opened drawings.
Solution: close all other drawings, or save the DXF as a drawing (DWG).

2) error message when generating graphics

An error message is displayed and the Display Model doesn't load when generating graphics over a DXF file. Error message shown:

Drawing of Display Model in Map failed. Adding layer failed: layername
Failed to retrieve message for "MgLayerNotFoundException".
The resource was not found.

This happens only if the DXF has been saved with a coordinate system.
Solution: save the DXF as a drawing file (DWG).

Both issues have been confirmed by Autodesk for Map 2017.

Tuesday 19 April 2016

MsMpEng.exe Slowing down MapGuide

The blog title is not entirely correct - MapGuide/AIMS is not directly affected - but here are the details:

Suddenly (on a Monday morning) users started to complain that our AIMS based WebGIS was particularly slow. As it turned out, it was not the AIMS server itself but the application around it which was slow. Opening a form or performing a search suddenly took seconds, but map creation was still as fast as usual. We do not use AIMS's Industry Model but TBVIEW1 as a better performing alternative.

We have three servers in use - the issue occurred on only one of them. Task Manager showed some unusual activity on the affected machine - the process MsMpEng.exe was running nearly constantly, consuming 25% of the CPU. Further investigation showed that MsMpEng became active as soon as the WebGIS application interacted with Oracle, for instance when opening a form, and remained active for some time. No other processes were using CPU power at the same time. For testing purposes our IT disabled MsMpEng.exe temporarily and performance returned to normal. They are now investigating why this happens and why it happens on only one of the three machines (W2008R2).

AIMS 2013, SP2

Monday 18 April 2016

Draw AutoCAD Polyline using Oracle Spatial Geometry

I found an interesting blog entry about drawing a polyline (straight line segments only) in AutoCAD using an Oracle Spatial geometry.

A quick test showed that the SQL statement doesn't seem to be correct. It also uses an undocumented Oracle command which is not available anymore in 12c (according to this website).

I changed the SQL script to use LISTAGG - which is only available in 11.2 and newer:


 -- only required if decimal separator is not a dot:  
 ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '. ';  
   
 -- creates list of coordinates which can be easily copied and pasted into AutoCAD  
   
 SELECT  c.fid,   
 '_pline ' || LISTAGG(t.X || ',' || t.Y, ' ') WITHIN GROUP (order by t.id) as pline  
 FROM ww_line c, -- your table name  
 TABLE(SDO_UTIL.GETVERTICES(c.geom)) t -- your spatial column name  
 WHERE c.fid = 138900 -- your ID  
 GROUP BY c.fid;  

To finish the "pline" command you press "ESC". I don't know how to have something similar when you copy and paste the command into AutoCAD so I couldn't add it to the SQL result set.

1. run SQL script
2. copy result
3. paste result into AutoCAD
4. press "ESC" to finish pline command


Line in Oracle Spatial (preview with GeoRaptor)

Line in AutoCAD

Map 2013, SP2

Tuesday 29 March 2016

Plot : FDO hatch issue with rotated viewports

As mentioned previously, there is an issue when plotting FDO hatches in a rotated viewport.

It seems to occur with different hatch patterns (except solid) and with different plot devices (DWG to PDF, HP DesignJet, DWF). Here is an example - in some areas the hatch is missing:





As we found out, not all plot devices seem to have the problem. We are able to plot our layouts without this particular issue using Adobe PDF writer. 

Map 2013, SP2

Monday 14 March 2016

ECW file header

There is a tool by Intergraph/ERDAS for modifying ECW headers called "ECW Header Editor". If you need to apply a new insertion point you can just open the ECW file, enter the new coordinates and save the changes back to the file.



A command line version is available as well. The tool is part of a package called "Apollo Essentials".

Current download URL : 

http://download.intergraph.com/downloads/erdas-apollo-essentials---image-web-server-2011-utilities-version-11.0.4


Wednesday 9 March 2016

Create hard links for multiple files in batch process

If you need to create hard links for multiple files in one folder you can use the following batch script:

@ECHO OFF

REM source folder containing the original files
set "source_path=C:\Data\Hansa\Befliegung\Mosaic_rgb\"
REM filter for the files to link
set "source_filter=*.tif"
REM target folder in which the hard links will be created (must be on the same volume)
set "target_path=C:\raster\2013\"

REM create a hard link in the target folder for every matching file
for %%i in ("%source_path%%source_filter%") do (mklink /h "%target_path%%%~nxi" "%source_path%%%~nxi")

pause

The script needs to be run with Administrator privileges.