Wednesday, 15 March 2017

Map Crash with error message "Method not found: '!!0[] System.Array.Empty()'"

I had my PC freshly installed and afterwards neither AutoCAD Map 2017 nor 3ds Max 2017 ran. Both crashed with error messages which looked like an .Net issue to me. Our IT department took my PC back and tried to reinstall - but to no avail. Apparently they had to change the harddisk to get both applications to run properly. 

Anyway - Map and Map-Administrator run now but only if I avoid loading a third party plugin. As soon as the plugin loads both programs crash - which doesn't happen on any other PC I tried. The CER report details show the following message:

<InnerException type="System.MissingMethodException"><Message>Methode nicht gefunden: "!!0[] System.Array.Empty()".</Message><StackTrace><Method>

It seems that the plugin was compiled against .net 4.6 but the .net runtime is lower than that. You can find a more detailed explanation here (reply by Alexandru).

That is a bit odd as Map 2017 itself requires .net 4.6 and therefore should not start up or run without 4.6 - but it does. When I checked the version of .net framework under "Installed Programs" it showed 4.6 but also 4.5:



The problem seems to be with language pack and / or registry settings.  Here are the two keys where the values don't match:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full\1031\Version = 4.6.00081
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full\1033\Version = 4.5.51209


On second PC both keys have the same value and under  "Installed Programs" it shows .net 4.6 twice:




Map 2017, SP1

Thursday, 2 March 2017

MapGuide log file

It is good practice to monitor log files. Unfortunately MapGuide log files are not easy to read:
- StackTrace details take up more space then the message itself, it is difficult to see when an error message starts and ends
- certain messages might appear very often although there are not really indicating an issue (such as "Session expired" message).

Here is Python3 script which reads MapGuide logs files and gets rid of StackTrace details and also filters out uninteresting messages. It adds a line number so that you can find the message in the original log file quickly. It also creates a simple summary for messages and their frequency. As we have two MapGuide servers the script is configured to read two different log file folders.  

If you want to use it you need to configure:
- the path for the MapGuide log files
- the path for saving the shortened log file 
- error messages you want to exclude
- the number of most recent log files you want to process (such as the 3 most recent ones)

I haven't done much in Python yet - the script doesn't do much error handling and has other shortcomings as well.

Thats how the result might look like:

****************
**************** Logfile: //wsstadt529/logfiles/MapGuide\Error.log
****************


<start>
3   DATE_TIME 2017-03-02 / 01:50:19
4    Success: Server stopped.

<end>
<start>
DATE_TIME 2017-03-02 / 01:51:23
6    Success: Server started.

<end>
<start>
DATE_TIME 2017-03-02 / 07:27:12
444    Error: An exception occurred in FDO component.
445           Error occurred in Feature Source (Library://FS_BEAR/WT_PO_BW_B/Data/TopobaseDefault.FeatureSource): Zeichenfolge ist kein gültiger Filter.  (Cause: Zeichenfolge ist nicht korrekt formatiert. , Root Cause: Zeichenfolge ist nicht korrekt formatiert. )
446    
<end>

...

<start>
DATE_TIME 2017-03-02 / 13:35:34
953    Error: An exception occurred in FDO component.
954           Error occurred in Feature Source (Library://FS_BEAR/WT_PO_BW_B/Data/TopobaseDefault.FeatureSource): Zeichenfolge ist kein gültiger Filter.  (Cause: Zeichenfolge ist nicht korrekt formatiert. , Root Cause: Zeichenfolge ist nicht korrekt formatiert. )
955    
<end>

*********************************************
************* Summary ***********************
*********************************************

# 3 :: Error: Failed to stylize layer: ZIM_Anlageobjekte_DynaS_GeoRest_MBR
# 36 :: Error: Failed to stylize layer: ZH_Orthofoto2015
# 1 :: Success: Server started.
# 1 :: Success: Server stopped.
# 27 :: Error: An exception occurred in FDO component.

************* File(s) ***********************
File processed: //wsstadt529/logfiles/MapGuide\Error.log
# of messages:68
# of messages excluded: 51

Here is the script:



 import os  
 import sys  
 from collections import Counter  
   
   
 """ extract date and time from logfile line """  
 def extractdatetime(a_text_line):  
   #DateTime Format in MapGuide-LogFile:  
   #<2016-12-15T11:00:34> <2015-07-05T12:39:55>  
   date = a_text_line[1:11]  
   time = a_text_line[12:20]  
   return (date, time)  
   
 """ checks whether a certain error message should be ignored / excluded from further processing """  
 def isErrorMessagesToExclude(a_message):    
   for text_to_find in errormessage_to_exclude:  
     if text_to_find in a_message:  
       return True  
   return False  
   
 """ removes the StackTrace details from an error message """  
 def removeStackTrace(a_message):    
   pos = a_message.find('StackTrace:')  
   if pos > -1:  
     return a_message[0:pos]  
   else:  
     return a_message  
   
 """ saves file """  
 def saveFile(text, filename, mode):  
   f = open(filename, mode)  
   f.write(text)  
   f.close()  
   
   
 """ returns a sorted list of files for a given directory , sorted by date  
   see: http://stackoverflow.com/questions/4500564/directory-listing-based-on-time  
 """    
 def sorted_ls(path):  
   mtime = lambda f: os.stat(os.path.join(path, f)).st_mtime  
   result = list(sorted(os.listdir(path), key=mtime))    
   return result  
     
 """ keeps only MapGuide Logfiles and filters out all other files """  
 def filterLogfiles(files_in_dir):  
   filtered_files = []  
   for filename in files_in_dir:  
     if filename.endswith(".log") and filename.startswith("Error"):   
       filtered_files.append(filename)  
   return filtered_files  
         
 """ processes a single MG log file and simplifies content """  
 def processLogFile(logfile):    
   # counter for line number in log file  
   line_number = 0  
   # counter for number of messages processed  
   counter_messages = 0  
   # counter for messages we exclude from processing  
   counter_messages_ingnored = 0  
   # internal counter  
   count_opening_tag = 0    
   message = ''  
   message_part = ''  
   # encoding utf-8, ascii : returns error message while reading a certain character/bytes  
   with open(logfile, encoding='“latin-1') as a_file:  
     for a_line in a_file:                    
       line_number += 1        
       # all messages start with '<' and first line also contains date&time  
       if '<' in a_line:  
         count_opening_tag +=1  
         #get date and time  
         str_date, str_time = extractdatetime(a_line)  
         a_line = 'DATE_TIME ' + str_date + ' / ' + str_time + '\n'                  
       # we processing the first line of the current message  
       if count_opening_tag == 1:  
         # line number and date/time information in one new line  
         message_part += str(line_number) + '  ' + a_line          
       # we have reached the first line of the following message - now we need tp process the previous message       
       if count_opening_tag == 2:          
         #first we check whether the message can be ignored  
         if isErrorMessagesToExclude(message_part) is False:  
           # we remove the StackTrace details  
           message_part = removeStackTrace(message_part)            
           counter_messages += 1  
           # we wrap the message text in <start><end> tags  
           message += '\n<start>\n'  
           # we add the processed message to the result  
           message += message_part  
           message += '\n<end>'  
         else:  
           counter_messages_ingnored += 1  
         #as this is the first line of the "next" message already a_line contains the DATE_TIME for it  
         message_part = a_line  
         # reset counter - the current line is the first line of the next message  
         count_opening_tag = 1  
     # last line reached - last message block is not yet fully processed  
     # code from above is repeated here to close the processing of last message in logfile  
     if isErrorMessagesToExclude(message_part) is False:  
       message_part = removeStackTrace(message_part)                  
       message += '\n<start>\n'  
       message += message_part  
       message += '\n<end>'    
       counter_messages += 1  
     else:  
       counter_messages_ingnored += 1  
     temp = ["File processed: "+logfile, "# of messages:"+str(counter_messages), "# of messages excluded:\t "+str(counter_messages_ingnored) ]  
     summary_files.append(temp)      
     print("File processed: "+logfile)  
     print("# of messages:"+str(counter_messages))    
     print("# of messages excluded:\t "+str(counter_messages_ingnored))      
     return message  
   
 """ converts the newly created log file(s) summary into a list """  
 def convertToList(processedlogfile):  
   with open(processedlogfile, encoding='“latin-1') as a_file:  
     error_message = False  
     list_final = []  
     date_temp =''  
     time_temp = ''  
     error1_temp = ''  
     error2_temp = ''  
     line_counter = 0  
     for a_line in a_file:        
       line_counter += 1  
       if '<start>' in a_line:          
         line_counter = 1   
         error_message = True  
       if '<end>' in a_line:  
         error_message = False  
         line_counter = 0  
         list_temp = [date_temp, time_temp, error1_temp, error2_temp]  
         list_final.append(list_temp)  
       if error_message:           
         if line_counter == 2:  
           date_temp = a_line[10:20]  
           time_temp = a_line[23:]  
         if line_counter == 3:  
           error1_temp = a_line[5:].strip()  
         if line_counter == 4:  
           error2_temp = a_line[5:].strip()  
   return list_final  
   
 """ iterates over all relevant files and creates summary"""  
 def processLogfiles(logfile_dir, saveLogFileName, number_of_most_recent_files):     
   # create a new file for the results  
   saveFile('', saveLogFileName, 'w')    
   # get all files from directory with MapGuide logs  
   files_in_directory = sorted_ls(logfile_dir)    
   # filter out all non MapGuide log files  
   files_in_directory = filterLogfiles(files_in_directory)  
   # only process the most recent files   
   if(number_of_most_recent_files > 0):    
     index = ((number_of_most_recent_files ) * -1)  
     files_in_directory = files_in_directory[index:]    
   # interate over relevant log files  
   for filename in files_in_directory:      
       fn = os.path.join(logfile_dir, filename)  
       # process single log file  
       log_file_short = processLogFile(fn)  
       # create header      
       header = "\n\n****************"  
       header += "\n**************** Logfile: " + fn  
       header += "\n****************\n\n"        
       # write to file  
       saveFile(header, saveLogFileName, 'a')  
       saveFile(log_file_short, saveLogFileName, 'a')  
   # all files have been processed and relevant information has been written into one file          
   # now we want to get a summary of logged issues  
   resultList = convertToList(saveLogFileName)      
   '''  
   resultList is list of lists, lists have 4 items each   
   items 3 and 4 are equal to line 1 and 2 of a message    
   now we just count item 3 and get a Dictionary where frequency of message is key and message itself is value    
   '''    
   res = (Counter(mysublist[2] for mysublist in resultList))  
   text = "\n\n*********************************************"  
   text += "\n************* Summary ***********************"  
   text += "\n*********************************************\n\n"  
   for message, number in res.items():  
     text += "# " + str(number)+ "\t :: " + message +'\n'    
   # append summary    
   saveFile(text, saveLogFileName, 'a')    
   summ = "\n************* File(s) ***********************\n"  
   for alist in summary_files:  
     summ += "\n".join(alist)+"\n"  
   # append summary    
   saveFile(summ, saveLogFileName, 'a')    
   
     
 if __name__ == ("__main__"):  
     
     
   """ add any message you want to ignore when processing the log files"""  
   errormessage_to_exclude = [  
                 'Session has expired',   
                 'Resource was not found: Session:',   
                 'Die Sitzung (',   
                 'Error: Authentication failed'                  
                 ]    
   # number of most recent log files to process  
   # 0 - for all files to be processed  
   # can be overwritten when python script is called with parameter  
   number_most_recent_files = 3  
                   
   #if argument is provided we assume its a number  
   if len (sys.argv) == 2 :  
     number_most_recent_files = int(sys.argv[1])  
     
   # path to MapGuide error log directory       
   logfile_dir1 = '//wsstadt529/logfiles/MapGuide'    
   logfile_dir2 = '//wsstadt516/logfiles/MapGuide'    
   # file name for result - simplified log file:  
   saveLogFileName1 = 'c:/temp/aims_log_processed_529.txt'  
   saveLogFileName2 = 'c:/temp/aims_log_processed_516.txt'  
     
   # start processing  
   summary_files = []  
   processLogfiles(logfile_dir1, saveLogFileName1, number_most_recent_files)        
   summary_files = []  
   processLogfiles(logfile_dir2, saveLogFileName2, number_most_recent_files)      
     
   # open file(s) in Editor  
   os.startfile(saveLogFileName1)  
   os.startfile(saveLogFileName2)