What I learned today
If I don't write it down I might forget it.
Monday, November 5, 2012
It looks like after installing ArcGIS 10.1 SP1, if you have background processing enabled for geoprocessing, you cannot use a VB expression type with the Calculate Field tool, even though VB is still the default expression type.
So if you have a script that calls arcpy.CalculateField_management() without explicitly setting Python as the expression type, it's not going to run if background processing is enabled...
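A minimal sketch of the workaround, passing the expression type explicitly (the table path and field name here are made up):

```python
import arcpy

# Hypothetical table and field; the point is the fourth argument.
# Passing "PYTHON_9.3" explicitly avoids falling back to the VB
# default, which fails under background processing at 10.1 SP1.
arcpy.CalculateField_management("C:/data/parcels.gdb/parcels",
                                "ACRES",
                                "!shape.area@acres!",
                                "PYTHON_9.3")
```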
Tuesday, January 24, 2012
Sort of a Game
An extension is now available to make writing code a game.
Here is the link
You get "Badges" when you compile code and VS finds stuff.
It could be interesting. I should give it a try when I get back into C# work.
Friday, January 20, 2012
ArcGIS Toolbox: Loading script tool parameter defaults from a config file.
Let's say we create a script tool to run in ArcGIS Desktop and it has 10 parameters. On average the user only needs to change one of those parameters to run the tool 98% of the time.
We make each parameter available because hard-coded parameters can kill your tool 2% of the time. Or maybe more like 10%.... Whatever.
It's best to have all the parameters exposed in the tool dialog and loaded with the defaults you're going to use 98% of the time. If these defaults are the same across many tools, you don't want to be going into each tool's parameters to set the defaults. You want one location to set all the defaults. That's your config file bro.
So here is an example of one tool's tool validator that loads the defaults from a config file located in the same directory as the toolbox we are running the tool from.
import sys
import os
import ConfigParser
import arcpy

# Find the path of the toolbox
scriptPath = sys.path[0]
# The config file should be in the same directory as the toolbox
configFile = scriptPath + "\\amp_config.cfg"

class ToolValidator:
    """Class for validating a tool's parameter values and controlling
    the behavior of the tool's dialog."""

    def __init__(self):
        """Setup arcpy and the list of tool parameters."""
        self.params = arcpy.GetParameterInfo()
        # Group the following parameters in the dialog box
        self.params[2].category = "Defaults"
        self.params[3].category = "Defaults"
        self.params[4].category = "Defaults"
        self.params[5].category = "Defaults"
        self.params[6].category = "Defaults"
        self.params[7].category = "Defaults"

    def initializeParameters(self):
        """Refine the properties of a tool's parameters. This method is
        called when the tool is opened."""
        return

    def updateParameters(self):
        """Modify the values and properties of parameters before internal
        validation is performed. This method is called whenever a parameter
        has been changed."""
        # If the config file is not found the parameters remain blank.
        if not os.path.exists(configFile):
            self.params[2].value = ""
            self.params[3].value = ""
            self.params[4].value = ""
            self.params[5].value = ""
            self.params[6].value = ""
            self.params[7].value = ""
            return
        # If the config file is found, move on to populate the parameters.
        # Add default values from the config file when the first parameter
        # is filled in for the first time.
        if self.params[0].altered and not self.params[0].hasBeenValidated:
            if self.params[0].value is not None:
                # Load the config file into memory
                config = ConfigParser.SafeConfigParser()
                config.readfp(open(configFile))
                # Get the values from the config file.
                scratchDir = config.get('environment', 'scratchDir')
                coverageName = os.path.basename(str(self.params[0].value))
                logFile = scratchDir + "\\" + coverageName + "_log.txt"
                self.params[3].value = logFile
                self.params[4].value = coverageName
                self.params[6].value = scratchDir + "\\" + coverageName + "_cov.gdb"
                self.params[2].value = scratchDir
                self.params[5].value = config.get('dataConversion', 'coverageList')
                self.params[7].value = config.get('dataConversion', 'annotationLayerList')
        # If parameter 1 is false then parameter 7 is greyed out.
        if self.params[1].value == False:
            self.params[7].enabled = False
        else:
            self.params[7].enabled = True
        return

    def updateMessages(self):
        """Modify the messages created by internal validation for each tool
        parameter. This method is called after internal validation."""
        return
Thursday, December 29, 2011
ArcGIS Project Management with Production Mapping
ESRI's Production Mapping has a lot of useful tools and functionality. I think it's kind of like that drawer in your house where you put things that are really great for something but not necessarily needed for anything.
For me the documentation is rarely clear or concise enough to understand, so I just have to tinker with it until I find something that works.
Recently I dove into the Product Library. I am working on a project where the product will be a workflow to create maps. The workflow includes some script tools, but the goal is to use as much out-of-the-box functionality as possible. Anyways, the development process requires setting up MXDs, toolboxes, scripts with config files, layout rules, styles, and just about anything else GIS related. It turns out the Product Library is a great way to sync all these things between team members.
I have not really figured out the structure yet, but after setting up a "Solution" and a "Product Class" we are able to check everything in and out of a versioned SDE database. If we choose to check everything out to the same location on our respective machines, the MXDs, toolboxes and whatnot keep working.
I'm just saying it's neat, that's all.
Wednesday, November 30, 2011
Converting Decimal Degrees to Degrees Minutes Seconds with Python
Thanks to this post at Another GIS Blog I was saved the time of having to figure out how to do this calculation on my own.
And I just noticed the author posted that TODAY, just 8 hours ago. That is crazy.
Thanks Guy.
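For my own notes, the conversion boils down to this (a rough sketch of my own, not the linked author's code):

```python
def dd_to_dms(dd):
    """Convert decimal degrees to a (degrees, minutes, seconds) tuple."""
    degrees = int(dd)                       # truncates toward zero, keeps the sign
    minutes_float = abs(dd - degrees) * 60  # fractional degrees -> minutes
    minutes = int(minutes_float)
    seconds = (minutes_float - minutes) * 60
    return degrees, minutes, seconds

# A San Francisco-ish longitude as a quick check
print(dd_to_dms(-122.4194))
```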
Tuesday, November 29, 2011
Tricky Data driven pages
I was having a problem with a data driven pages script where I needed to use the pageRow property to access an attribute. The only examples I found required hard-coding the field name, but after a little Googling I found someone's suggestion to use:
fieldName = arcpy.GetParameterAsText(1)
mxd.dataDrivenPages.pageRow.getValue(fieldName)
This brings in the field name from the input.
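In context, that call sits inside a loop over the pages, something like this (a sketch only; the parameter index and the message output are my assumptions, not the original script):

```python
import arcpy

mxd = arcpy.mapping.MapDocument("CURRENT")
fieldName = arcpy.GetParameterAsText(1)  # field name supplied by the tool dialog
ddp = mxd.dataDrivenPages
for pageNum in range(1, ddp.pageCount + 1):
    ddp.currentPageID = pageNum
    # Read the attribute for the current page without hard-coding the field
    value = ddp.pageRow.getValue(fieldName)
    arcpy.AddMessage("Page {0}: {1}".format(pageNum, value))
```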
Monday, October 10, 2011
i5 2500k and ASUS P8P67 R3 1155 not boot, not posting, nothing...
I was having problems with my machine, which I just built about 3 months ago. I would hit the power button, the power light on the case would give a quick flash, and the CPU and GPU fans would make about one rotation, then nothing. If I hit the power button again nothing would happen. So I would turn the PSU off for a few seconds and try again. This would get me the same results 5-10 times until I got lucky and the machine would start up. Once up, the machine worked great. No problems at all for however long I had it running. But after turning it off it was back to the same dance to get it going again.
Everything in the build was 3 months old with the exception of my OCZ 600W power supply and my hard drive. After trying about everything I could find on the Google machine, I decided it must be the power supply.
After picking up a new PSU and testing it, I found that was indeed the issue. I have had the new PSU in for a few days now and done a bunch of cold starts with no problems...
I am disappointed that my OCZ PSU died. It was a nice PSU and only 3 years old. I always kept it clean too...
Oh well.