2017 Esri Developer Summit – Day Two Recap

Published March 9, 2017 by Kaitlyn Thomas

Patrick Scanlon

Today Esri further clarified their strategy for scalable, distributed server-side architecture. Mark Skinner from Nvidia spoke about how their CUDA-based hardware is helping all kinds of organizations run distributed processing. Mansour Raad and Adam Mollenkopf showed off using WebGL to create a 3D mesh representing millions of records of data and how that data changes over time (you can read more about that here). Finally, Adam Mollenkopf and Suzanne Foss demonstrated a practical approach to horizontally scaling GeoEvent Server, GeoAnalytics Server, and the Spatiotemporal Big Data Store using Apache Mesos and DC/OS.

One thing that was a great help to my personal day-to-day operations was the set of sessions on Web AppBuilder (WAB) best practices. Here's a quick summary (the slides for the talk are here):

  1. The “libs” directory is there for you to store shared resources and utilities. It is also handy for facilitating cross-widget communication: simply define functions that your publishing widget can call and raise events that your subscribing widgets can listen for (see the pub/sub sketch after this list).
  2. Keep your code outside the WAB folders and use grunt-watch or gulp-watch to copy your code into the WAB whenever it changes (see the Gruntfile sketch after this list).
  3. Use a generator rather than creating WAB widgets from scratch.
  4. Write your widget using vanilla JSAPI and use the WAB widget to wrap it.  This ensures higher portability as well as ease of testing.
  5. Test your code outside the WAB.
  6. Use the build tool when preparing to deploy to production.
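
For the cross-widget communication pattern in item 1, here is a minimal sketch using dojo/topic publish/subscribe. The widget names, topic name, and handler are hypothetical; they only illustrate the idea of one widget raising an event that another widget listens for.

```js
// ---- PublisherWidget/Widget.js (hypothetical) ----
define(['dojo/_base/declare', 'dojo/topic', 'jimu/BaseWidget'], function (declare, topic, BaseWidget) {
  return declare([BaseWidget], {
    onFeatureClicked: function (feature) {
      // Broadcast the selection; any widget subscribed to this topic will hear it.
      topic.publish('myApp/featureSelected', { feature: feature });
    }
  });
});

// ---- SubscriberWidget/Widget.js (hypothetical) ----
define(['dojo/_base/declare', 'dojo/topic', 'jimu/BaseWidget'], function (declare, topic, BaseWidget) {
  return declare([BaseWidget], {
    startup: function () {
      this.inherited(arguments);
      this._selectionHandle = topic.subscribe('myApp/featureSelected', function (msg) {
        console.log('Another widget selected:', msg.feature);
      });
    },
    onClose: function () {
      this._selectionHandle.remove(); // clean up the subscription when the widget closes
    }
  });
});
```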
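
And for item 2, a minimal Gruntfile sketch, assuming grunt-contrib-copy and grunt-contrib-watch are installed; the source folder and the Web AppBuilder stemapp path are placeholders for your own layout.

```js
// Gruntfile.js (sketch): copy widget source into the WAB install whenever it changes.
module.exports = function (grunt) {
  grunt.initConfig({
    copy: {
      widgets: {
        expand: true,
        cwd: 'src/widgets',                                     // your source folder
        src: ['**/*'],
        dest: 'WebAppBuilderForArcGIS/client/stemapp/widgets/'  // your WAB install
      }
    },
    watch: {
      widgets: {
        files: ['src/widgets/**/*'],
        tasks: ['copy:widgets']
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-copy');
  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.registerTask('default', ['copy', 'watch']);
};
```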

Finally, this talk by Rene Rubalcava made me smile: How to integrate the ArcGIS JS API into other JS frameworks. He covers Angular, Ember, React, Vue.js, and Elm. Definitely check out the slides for that talk here; there are code snippets for each framework. Additional resources on the topic can be found here.
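
As a flavor of what that integration looks like, here is a minimal sketch of wrapping a 4.x MapView in a React component. It assumes the JS API script and CSS are already loaded from the CDN (so the global AMD require() is available); the talk's own samples may use a loader helper instead, and the component name and map settings here are just examples.

```js
import React from 'react';

class EsriMapView extends React.Component {
  componentDidMount() {
    // Lazily load the API modules so the rest of the app stays framework-agnostic.
    window.require(['esri/Map', 'esri/views/MapView'], (Map, MapView) => {
      this.view = new MapView({
        container: this.mapNode,               // the DOM node rendered below
        map: new Map({ basemap: 'streets' }),
        center: [-118.24, 34.05],              // Los Angeles, as an example
        zoom: 12
      });
    });
  }

  componentWillUnmount() {
    if (this.view) {
      this.view.container = null;              // detach the view from the DOM
    }
  }

  render() {
    return <div style={{ height: '400px' }} ref={(node) => { this.mapNode = node; }} />;
  }
}

export default EsriMapView;
```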

Heather Roberts

We started off the day with a keynote presentation from PubNub on realtime communication APIs and the streaming web. Afterwards, most of my sessions centered on ArcGIS Enterprise 10.5, with some sessions on Web AppBuilder and Python. Below are some highlights from my sessions today.

The ArcGIS Python API can be used to script your Web GIS. This includes enterprise integration such as user, role, and group management; location analytics using imagery and spatial analysis; publishing content; and big data work using raster analytics and feature analytics. Python can also be used in Pro to share web tools, much as you would share GP services from ArcMap. The tools can be shared directly from Pro, or scripted to publish multiple services or publish to multiple servers. A couple of considerations: Python can be data greedy, and the data is consolidated when published to the server. After publishing, it's referenced in memory, so try to avoid absolute and relative paths when possible.

Web AppBuilder Developer Edition applications in Enterprise deployments can be registered as items in your Portal. You can then share an application with your organization or export it as a template for your organization to use. You also have the option to disable the embedded Web AppBuilder in Portal so that only the apps you want, with custom themes or widgets, are in use.

There are some best-practice considerations for ArcGIS Enterprise performance and scalability. Mostly you need to consider CPU and RAM, which become increasingly important as you increase the number of services. It's also important to pay attention to slow services, which typically result from publishing practices.

With ArcGIS Enterprise security, it is strongly recommended to enable and use HTTPS, as browser support for plain HTTP is diminishing and HTTPS is now required for geolocation in Chrome. It's also worth considering disabling the services directory in production if there is concern over scanner attacks. With the new Portal-to-Portal Collaboration, you may want to share only service items, which are just references to your data, rather than data items, unless your intent is to distribute the data.

Joel Brown

The Esri JavaScript 3.x API kind of has an attitude problem. It's not that it's impolite; it's just very opinionated in its use of the Dojo JavaScript framework. Anyone who has tried to theme the API and its assortment of API-provided “dijits” is forced to delve into the world of Dojo whether they like it or not. With the explosion of JavaScript frameworks and tools over the past few years, it can be a real bummer to limit yourself like this. That is why I was pleasantly surprised to see the 4.x API take a more unopinionated approach. The 4.x API still uses Dojo, but its use is less conspicuous and pretty much non-existent in the presentation layer. I attended several sessions today that highlighted these improvements.

Perhaps the biggest improvement in this area is that widgets are no longer implemented as Dojo dijits. In fact, the widgets are now implemented using separate view and view model classes. The default view pattern is inspired by Maquette JS and uses JSX, but you can swap in your own implementation if you prefer something else. Decoupling the view from the view model lets you use your own presentation layer without having to touch the widget behavior logic, which is encapsulated in the view model class provided by the API.
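
Here is a minimal sketch of that decoupling, pairing an API-provided view model with a hand-rolled presentation layer. It assumes an existing 4.x MapView stored in a variable named view, and the button element ids are placeholders.

```js
require(['esri/widgets/Zoom/ZoomViewModel'], function (ZoomViewModel) {
  // The behavior (zoom logic, view wiring) lives in the view model from the API.
  var zoomVM = new ZoomViewModel({ view: view });

  // The presentation layer is entirely ours: two plain buttons instead of the stock widget.
  document.getElementById('zoom-in').addEventListener('click', function () {
    zoomVM.zoomIn();
  });
  document.getElementById('zoom-out').addEventListener('click', function () {
    zoomVM.zoomOut();
  });
});
```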

Removing the dependency on Dojo dijits also removes the need to account for Dojo-related CSS when cooking up a custom theme or style for your app. The default CSS at 4.x is completely revamped and now follows the BEM naming convention. This CSS pattern is designed to make CSS more readable, modular, and easier to reason about. This is great for the use case where you just need to tweak the default CSS by overriding a few selectors. If you need to do heavy-duty theming, the Sass source code for the 4.x API CSS is available as well.

Dan Huber

Day two of the Esri DevSummit started out with a great keynote from Todd Greene, CEO and founder of PubNub. He covered why and how we've become an always-connected, instant-response-expecting society, and outlined the best ways to meet those consumer goals. Having developed solutions in the past that required access to and aggregation of real-time information, it was refreshing to see a company focused on providing 0.25-second worldwide latency with a five-nines uptime promise. I am definitely going to become one of their clients/users. Bringing it back to GIS, Esri is providing a PubNub “BLOCK” – a customizable microservice for developers to use in that environment – that accesses Esri's geocoding services, with the goal of providing similar BLOCKs for routing, basemaps, and GeoEnrichment. With all the buzz about IoT going around lately, Todd was definitely a great choice for the keynote at this year's event.
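
To make that model concrete, here is a hedged sketch of the PubNub publish/subscribe flow a geocoding BLOCK would plug into, using the PubNub JavaScript SDK. The channel name, message shape, and demo keys are assumptions for illustration; this does not claim to reproduce how the actual Esri BLOCK names its channels or enriches messages.

```js
// Client-side sketch: publish an address and listen for a (hypothetically) enriched reply.
var pubnub = new PubNub({
  publishKey: 'demo',      // placeholder keys
  subscribeKey: 'demo'
});

pubnub.addListener({
  message: function (event) {
    // A geocoding BLOCK could attach coordinates to the message before it is delivered.
    console.log('Received:', event.message);
  }
});

pubnub.subscribe({ channels: ['geocode'] });

pubnub.publish({
  channel: 'geocode',
  message: { address: '380 New York St, Redlands, CA' }
});
```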

After the keynote, it was time to start focusing on the reason I'm attending this event: the sessions. First on the schedule was an update on the newish ArcGIS Python API. This was covered by Rohit Singh, the lead developer on the project and definitely its chief evangelist. I am really looking forward to working with this library, as it continues to offer the most functionality of all the ArcGIS libraries and provides the best responsive development environment when coupled with Jupyter notebooks. I hope the other library maintainers at Esri start following the pattern this team is setting – or better yet, move their efforts into this one.

The second session was hosted by Bill Major and Cherry Lin and covered the work they are doing with the Chef deployment recipes for ArcGIS. Maintaining a simple and repeatable deployment process has been a goal of mine for the past few years, and it looks like Esri is putting a good amount of effort into supporting this in their products. I was a bit disappointed that they gave only a brief mention of what is expected to be included with 10.5.1 to support automated deployments, so I guess I will have to wait for the UC to find out.

The ArcGIS Enterprise security team gave a great presentation on best practices for securing your ArcGIS Servers and Portals, complete with meaningful demonstrations of how the settings work and what goes wrong when you don't follow them. And if you haven't checked them out yet, both Server and Portal provide Python scripts that an admin can run to scan their systems and see whether they are following these best practices.

The one session that disappointed me was the Performance and Scalability Best Practices talk given by Andrew Sakowicz and Frank Pizzi. While I enjoyed hearing Andrew's anecdotes of ‘deployments gone wrong’, and appreciate the insight he provides from all his experience, I wish they had not kept saying that their System Monitor tool suite was the best option to use. The tool is only available through a Professional Services engagement, which means a lot of clients do not have access to it.

Things I learned:

  • Setting up Portal-to-Portal Collaboration only really works if both Portals use the same identity service. This limitation is fine for some of our clients, but not for many of our Federal customers. The issue lies with accessing the services – if the Portal you share with cannot access the services, the items you share will not have any value.
  • Configuring a relational database connection in Insights is not easy – and pretty much impossible if you have the admin services disabled on your ArcGIS Server's Web Adaptor. Scripting the setup and configuration may be the answer.

Stephanie Lindley

Test, baby, test 123!

Automated testing, something everyone talks about, but how often do we actually do it?

We all know how important testing is and how, the further along in development we get, the more expensive testing gets.

Do we as a company – TAs, developers, SEs, and analysts – give testing a fighting chance? Do we implement unit testing up front or include it in our levels of effort? Do we know what the ROI is for doing so, or the problems it may alleviate?

I learned about visual unit testing with Spectre, where you take a picture before and after a change and, using machine learning and artificial intelligence, it spots the differences, giving you a visual representation of pass or fail. I thought that was pretty cool and 100% relevant.

Then they mentioned other software such as Selenium, WebDriver, RSpec, and Cucumber (I was drinking the last one in my water). It seems this is something we need to explore and implement in our applications, as the benefits outweigh the cost.
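
As a small taste of what automating this looks like, here is a minimal sketch that captures a baseline screenshot with the selenium-webdriver package for Node. The URL and file name are placeholders, and the before/after comparison step (whether Spectre or something else does the diffing) is left out; this shows only the capture half of the workflow.

```js
const { Builder } = require('selenium-webdriver');
const fs = require('fs');

(async function captureBaseline() {
  // Spin up a browser, load the app, and save a screenshot to compare against later.
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('http://localhost:3000/my-map-app');   // placeholder URL
    const png = await driver.takeScreenshot();              // base64-encoded PNG
    fs.writeFileSync('baseline.png', png, 'base64');
  } finally {
    await driver.quit();
  }
})();
```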

Dan Levine

Well, I spent much of my time getting smarter on the new Esri stack supporting big data/IoT, and I gotta say I am impressed. They have come a long way in the past few years. Recall Mansour in a demo theatre three years ago flying through the technology stack (at least 7 or 8 different solutions) to get to some sort of big data visualization or analysis. Three weeks later, the same talk would have switched out 2 or 3 of those. The technologies were succeeding and failing fast. Now the stack has settled, and it feels like Mansour's straight man and equally big-brained colleague, Adam Mollenkopf, has been leading the implementation of a truly enterprise stack that can support the consumption, visualization, and analysis of just about any volume of data. Adam introduced Project Trinity, which puts the technology in a configuration that allows for truly massive scaling. It is mind-boggling what the potential is for this as it becomes a product/service.

Other technologies near and dear to my heart are VR and AR. I was stoked to hear today that Esri Labs is moving at a rapid pace to push ArcGIS 360VR to a product, with an accelerated roadmap to continue enhancements and broaden the tools for authoring and viewing results. I was further excited to see the investment in augmented reality. I really think it will be commonplace in the near future to see field workers on the street using AR to streamline their workflows.
