Main Drupal Feed

Aggregated feeds in category Planet Drupal

Mobomo: Is Acquia the Right Platform Choice?

Thu, 02/27/2020 - 14:21

You’re having trouble keeping up with demand and need a more powerful and robust website platform.

As business problems go, that’s a great one to have. Especially for enterprise-grade organizations and government entities. The question is: Which website platform is best?

To help you make informed decisions about your platform choice, we’re sharing a look at what Acquia has to offer. In this post, you’ll learn what Acquia is and how it works, who should consider using the platform and who should not. Then you’ll read our thoughts on what should be top of mind when selecting a platform.

Full disclosure: Mobomo is an Acquia partner organization, meaning we help clients make the most of their Acquia technology and services. Far from being a hard sell, however, this post aims solely to provide expert analysis and an honest assessment of the company and its products.

What Acquia Is and How It Works

Acquia is considered a digital experience platform (DXP), which is a collection or suite of products that work in concert to manage and optimize the user’s digital experience. These products can include a CRM, analytics, commerce applications, content management and more.

In its industry report on DXPs, Magic Quadrant for Digital Experience Platforms, Gartner defines a digital experience platform as “an integrated set of core technologies that support the composition, management, delivery and optimization of contextualized digital experiences…Leaders have ample ability to support a variety of DXP use cases and consistently meet customers’ needs over substantial periods. Leaders have delivered significant product innovation in pursuit of DXP requirements and have been successful in selling to new customers across industries.”

Organizations use DXPs to build, deploy and improve websites, portals, mobile and other digital experiences. They combine and coordinate applications, including content management, search and navigation, personalization, integration and aggregation, collaboration, workflow, analytics, mobile and multichannel support.

Acquia is one of the major players in this space, and the only one designed solely for Drupal.

Acquia co-founder Dries Buytaert was in graduate school in 2000 when he created the first Drupal content management framework. Buytaert and Jay Batson then established Acquia in 2007 to provide infrastructure, support and services to enterprise organizations that use Drupal.

Features and Benefits of Acquia

Acquia initially offered managed cloud hosting and fine-tuned services for Drupal. It has since expanded on its Drupal foundation to offer a complete DXP, including but not limited to:

  • Acquia Cloud: Provides Drupal hosting, development tools and enterprise-grade security.
  • Acquia Lightning: An open source Drupal 8 distribution with preselected modules and configuration to help developers build sites and run them on Acquia Cloud.
  • Acquia Digital Asset Management: A cloud-based digital asset management tool and central library for Drupal sites.
  • Acquia Commerce Manager: Provides a secure and flexible platform for content-rich experiential commerce.
  • Mautic: A marketing automation platform that enables organizations to send and personalize multi-channel communications at scale.
  • Acquia Journey: An omnichannel tool that allows marketers to listen and learn from customers to craft a sequence of personalized touchpoints and trigger what they will see next.

Additionally, Acquia provides the comprehensive logging, performance metrics, security and Drupal application insights, and uptime alerts organizations need to monitor and optimize applications.

The Acquia platform also shines in its security capabilities, supporting strict compliance programs such as FedRAMP, HIPAA, and PCI, among others. Acquia customers can also internally manage teams at scale with advanced teams and permissions capabilities.

And they’re running with the big dogs. Other DXP companies assessed in the Gartner Magic Quadrant report include Adobe, IBM, Salesforce, Liferay, SAP, Microsoft and Oracle.

In that report, Gartner cited Acquia’s key strengths as follows:

  • Acquia Experience Cloud offers a wide array of capabilities well-suited to support the B2C use case. Some clients also use it for B2B and B2E use cases.
  • The open-source community behind Acquia, which is the main contributor to the underlying Drupal WCM system, is highly active and well-supported by the vendor.
  • Acquia’s partner ecosystem continues to grow, offering choices to clients looking for expertise in specific verticals and availability in specific regions.

Who Should Consider Acquia

In a nutshell, Acquia is a good fit for enterprise-grade clients and government entities needing a comprehensive and powerful platform that optimizes the entire user experience while integrating data from multiple sources to support decision-making. Organizations that deploy and manage multiple websites will find Acquia particularly helpful.

One glance at Acquia’s customer page crystallizes the scope and scale of the organizations it serves. Brands using Acquia include Wendy’s, ConAgra Brands, the University of Virginia, the City of Rancho Cucamonga in California and Australia’s Department of the Environment and Energy.

According to Website Planet, what sets Acquia apart is its foundation in the open-source Drupal content management framework. Unlike many of its competitors, Acquia allows customers to buy resources and features individually rather than purchasing entire pre-made packages. This can be particularly appealing to organizations that already have a couple of strong individual solutions in place that they want to integrate into their DXP, such as this reviewer in the manufacturing industry:

“A few things drove me to this solution: Decoupled architecture that allowed me to build a completely distributed digital landscape while keeping central control, The Open Platform concept that allowed me to build my own integrations and connect different components of my existing Martech stack without always using the “default” provided options and the comfort/security of relying on a cloud-based solution with full service support on top.

For e-commerce website owners, Acquia’s packages provide a PCI DSS compliant solution that can easily scale to accommodate extensive product catalogs, large transaction volumes and surges in traffic. Acquia’s proprietary e-commerce manager integrates the various content, commerce and user interfaces, allowing you to provide seamless experiences to your customers through a single system.”

Who Should Not Consider Acquia

Acquia is best suited for organizations with both the need for such a powerful suite of tools and the development expertise to easily implement and manage it. Beginners and small businesses lacking the requisite knowledge of programming and Drupal are likely better off with a different provider.

If you develop your website through an agency, you’ll want to double-check that it will provide developers experienced with Drupal 8. If you develop in-house, make sure your developers have strong familiarity with it.

Additionally, Acquia’s power comes at a price: Its price point may put it out of reach for small-to-medium businesses.

Acquia: Our Takeaway

As with any other significant investment, the best choice for your organization boils down to the wants and needs of you, the consumer. Keep these points in mind when assessing how well Acquia matches up with your master list of must-haves.

  • Determine your desired business outcome. Think about what you’re after in terms of improving the business. What does each DXP offer and can you make the most of every feature you’re paying for?
  • Know your stack. Document your current technology architecture: what do you have, who uses it, for what and how is it connected?
  • Determine use cases. Who will use your technology and how will it make them productive?
  • Prepare your people. Your personnel play a massive role in assembling your digital experience technology stack. Don’t set yourself up to spend time and money on a platform that doesn’t get adopted or used to its potential.

By conducting a thorough assessment of your organization’s needs, capabilities, and goals, you can readily determine whether Acquia is the best fit to help you provide an amazing digital experience for your audience.

Contact us today and find out how Mobomo can help you make the most of Acquia.


Tag1 Consulting: What the future holds for decoupled Drupal - part 2

Thu, 02/27/2020 - 14:14
As another decade comes to a close in Drupal’s long and storied history, one important trend has the potential to present amazing — or unsettling, depending on how you look at it — opportunities and shifts in the Drupal ecosystem. That trend is decoupled Drupal.

Mediacurrent: Drupal 8 Feeds Import with External JSON API

Thu, 02/27/2020 - 13:00

There are many resources online around using the Feeds module in Drupal 8 to import data from a 3rd party API. Most of the resources, however, are specific to XML and RSS feeds. In this blog post, I wanted to document my journey of how I was able to get Feeds to import into Drupal 8 from a 3rd party API that provides JSON data.

About the 3rd Party API

The 3rd party API I chose to use for the purposes of this blog post is about one of my favorite TV shows that you’ve probably heard of, Breaking Bad.

The Breaking Bad API offers a few different endpoints, but I decided to use the Characters endpoint as the source that will be imported as Drupal Character nodes.

It appears the API just returns an array of JSON objects, which obviously are Breaking Bad characters and various attributes about the characters.

This API is free and open to use for the public and is great for testing purposes. Under normal circumstances and real projects, APIs will typically be locked down and only accessible with an authenticated user. Authentication with Feeds is outside the scope of this post.

Initial Drupal Setup

This assumes you already have a running Composer-based Drupal installation. I will not be covering how to do that, but will say that for this example, I used the Drupal 8 quickstart provided by DDEV-Local for my local environment setup.

Feeds module

First of all, we need to install the Feeds module. This will be the module doing most of the work and processing to import the items from the JSON API into Drupal.

Feeds Extensible Parsers module

Because the Breaking Bad API returns to us a JSON response, we need a JSON parsing system in Drupal. We can use the Feeds Extensible Parsers module to parse the JSON.

We can simply run composer require drupal/feeds_ex, and Composer will make sure all dependency modules and other libraries get included, such as Feeds module and the JsonPath library.

Note, because I’m using DDev-Local for my local environment, I needed to run:
ddev composer require drupal/feeds_ex

Pro tip: You’ll likely also want to include this Feeds patch:

The patch addresses an issue where, even if the feed import is unsuccessful, the API URL must be slightly changed to get around some caching that’s happening behind the scenes. Before this patch, I worked around it by appending a query string parameter to the API URL each time I wanted to run the import, such as:
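As an illustration of that workaround, here is a small Python sketch that appends a timestamp query parameter to a URL before each import (the base URL is hypothetical, since the actual API URL is not shown above):

```python
import time
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def cache_bust(url: str) -> str:
    """Append a throwaway query parameter so each import sees a 'new' URL."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["_ts"] = str(int(time.time()))  # any changing value works
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical API URL, for illustration only.
print(cache_bust("https://example.com/api/characters"))
```

Any changing value works equally well; the point is only that the fetcher sees a URL it has not downloaded before.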

Go ahead and install the Feeds and Feeds Extensible Parsers modules on the module list page. Next, let’s create the Character content type we will use to import the Characters from the API into within Drupal.

Character Content Type

I created a new content type, Character, because, well, that makes sense for the type of content Drupal will be ingesting from the external API. We also need to create the fields for each of the pieces of character data that will be migrated into Drupal.

To keep it simple, I’m mainly dealing with simple string fields and mappings. The Title field actually isn’t shown here, but is a required field on the node type. I’ve also added fields for Actor and Image path, which will pull from portrayed and img respectively in the API.

For more complex data sources, you may need to use the Feeds Tamper module to massage the imported data to fit your field structures, but that is outside the scope of this post.

Feeds Configuration

Next, let’s configure Feeds to be able to ingest the 3rd party JSON, parse it, and map it to the Drupal fields during the import process.

Feed Type

Simply add a new Feed Type. I called mine Breaking Bad Characters. Mainly, we’ll want to use the Download from url Fetcher, and the JsonPath Parser. We’ll also need to configure where we need the imported content to go. So, in this case, we can choose Node for processor, and Character for content type.

I’ve left all other settings at their defaults, except under Processor settings, where I changed the Update existing contents setting to Update existing content. This controls what happens when the API data changes and how Drupal should handle the existing content the next time the feed import runs. It will be important during the mapping step to make sure we have a unique identifier for each feed item, which is how Feeds knows whether it is new or existing imported content.
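Conceptually, that unique identifier is what lets the importer decide between creating and updating. A small Python sketch of the decision (an illustration of the idea only, not Feeds internals; the node IDs are arbitrary):

```python
# Sketch of how a unique identifier (such as the Feed GUID) lets an importer
# decide between creating new content and updating existing content.
existing = {}  # guid -> node id for content already imported

def import_item(guid: str, existing: dict) -> tuple:
    """Return the action taken plus the node id it applies to."""
    if guid in existing:
        return ("update", existing[guid])           # seen before: update in place
    node_id = max(existing.values(), default=100) + 1
    existing[guid] = node_id                        # remember it for the next run
    return ("create", node_id)

print(import_item("char-1", existing))  # first run creates a node
print(import_item("char-1", existing))  # second run updates the same node
```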

Note: If you used the patch I mentioned earlier, you’ll also want to check the box for Always Download under Fetcher settings.

Click Save and add mappings, although we’re not going to add our mappings quite yet, since we first need to review and test with the API a bit to figure out how to set up the mappings.


Under the Content tab, on the toolbar, there should be a Feeds menu item we can click into. This is where we can add the actual feed and tell Drupal which API URL to use.

Click Add feed. Give it a Title, such as Breaking Bad Characters, and a Feed URL. The Feed URL will be the URL of the 3rd party API.

Click `Save`. The Import or Save and Import button won’t work yet since we haven’t configured the field mappings, so let’s circle back to doing that now.

Jump back over to the Feed type you created earlier and click on the Mapping tab.
There are 2 steps here - Context and Field Mappings.


Looking at the `Context` setting, it’s a bit unclear what value we need to enter. According to the help text (“The base query to run”), it sounds like what it’s looking for is the root or base of the data: essentially, telling Feeds how to access the set of data we want to import. We noted previously that this API just returns an array of objects, which is exactly what we need, but we still need to figure out how to query it with the JsonPath syntax.

JsonPath Syntax

We can use an online JsonPath evaluator tool to plug in the JSON and figure out how to use JsonPath to get to the data we need.

Looking at the JsonPath documentation, you would think you could just use the dollar sign ($) syntax, since it says that provides the root object/element.

In the end, through the online tool, documentation and debugging, I figured out that the Context setting I needed for this case was $.*.
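To make the `$.*` context concrete, here is a small Python illustration of what that query selects from a root-level array (the sample character data and field names are assumptions modeled on the shape of the API response):

```python
import json

# Sample payload shaped like the Characters endpoint: a bare JSON array of
# character objects. The field names here are illustrative.
payload = json.loads("""
[
  {"char_id": 1, "name": "Walter White", "portrayed": "Bryan Cranston"},
  {"char_id": 2, "name": "Jesse Pinkman", "portrayed": "Aaron Paul"}
]
""")

# JsonPath "$" is the document root; "$.*" selects every element under the
# root. For a root-level array, that is simply each item in the list.
items = list(payload)

for item in items:
    print(item["name"])
```

Each of those items then becomes one feed item for the importer to map onto a node.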

Field Mappings

Now, we can configure Feeds to decide which value pulled in from the JSON maps to which field on the Drupal Character content type.

Let’s start with the easy one, the character name, and import into the Character node’s title field.

Choose Title (title) for the target. Under source, click New source.... In the textbox that pops up, enter name. That maps over to the name key in the API JSON for each of the character objects in the array.

We need that one because, as you know, the Title field is required on nodes. If we didn’t do that mapping, the feed import would fail and Drupal would complain that the Title field is required.

Then, we add the mappings for the other fields, including the unique identifier one I mentioned earlier. This uses char_id as the value for Feed GUID, which is how Feeds will track whether the content is new or existing during the import process.
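The mappings above amount to a simple source-key-to-target-field translation. A Python sketch of the idea (the Drupal field machine names are assumptions for illustration, not the module's actual internals):

```python
# Each API source key is mapped onto a Drupal field target, mirroring the
# Feeds mapping UI. The field machine names are hypothetical.
field_mappings = {
    "name": "title",               # character name -> required node title
    "portrayed": "field_actor",    # actor -> Actor field
    "img": "field_image_path",     # image path -> Image path field
    "char_id": "feeds_item_guid",  # unique ID so Feeds can detect new vs. existing
}

def map_item(api_item: dict) -> dict:
    """Translate one API character object into Drupal field values."""
    return {target: api_item[source] for source, target in field_mappings.items()}

node_values = map_item({"char_id": 1, "name": "Walter White",
                        "portrayed": "Bryan Cranston", "img": "walter.jpg"})
print(node_values)
```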

Run the Import!

Now that we have Feeds configured with the API URL, along with the context and field mapping settings, we should be good to run the import. Jump back over to the Breaking Bad Characters feed we set up earlier, and under Operations, click Import, and Import again to perform the import.

If we visit the content overview page, we can see we now have Character nodes in the list. Editing one of the character nodes, we can see the data that was imported into Drupal for the character matches the data provided by the API and how we mapped it over using Feeds.

It's a Wrap!

This is probably a very simple example of using Feeds to import JSON in Drupal, but I mainly wanted to highlight the problem areas that did occur in my case in an effort to help others who face the same issues.

Specbee: Drupal 7 to 8 Migration - A How-to guide that addresses migration Challenges (with recommendations)

Thu, 02/27/2020 - 12:24
Harika and Shefali | 27 Feb, 2020

Still running your website on Drupal 7 (or 6)? It’s about time to migrate to Drupal 8!

We have written extensively about why you should migrate to Drupal 8 if you’re still on Drupal 7 (or 6). One of our favorite (and most significant) reasons to migrate to Drupal 8 is that Drupal 9 is coming! And if you want to enjoy the benefits of Drupal 9, it is recommended that you migrate your website to Drupal 8 first. We could argue that you should move to Drupal 8 now because there isn’t much time between Drupal 9’s arrival (June 2020) and Drupal 7’s EOL (Nov 2021). You could counter that you can opt for LTS (long-term support) instead. Fair enough. Except that apart from spending more money on an LTS service provider, you are also losing out on the rich benefits that Drupal 8 has to offer. Some things may seem difficult but are necessary for a stronger and simpler future.

Once on Drupal 8, you don’t have to “migrate” anymore – only a simple “upgrade” from Drupal 8.9 to 9, and then 9.9 to 10 and so on, will do. Migrating from Drupal 7 to Drupal 8 is not always easy and straightforward; I agree. Following a process helps but you might still face some challenges during the migration, especially if your Drupal 7 website’s content model is considerably complex. Let’s take you through a step-by-step migration from Drupal 7 to Drupal 8 with challenges that you might face. And our recommendations on how to overcome them.

Drupal 8 Migrate – Assumptions and Preparations

“To be prepared is half the victory,” said Spanish novelist Miguel de Cervantes. A Drupal 8 migration can get complicated, but if you have spent enough time planning it, the challenges won’t surprise you. Drupal 8’s adoption of many modern development standards like Symfony, Twig and PHP 7 led to this complete rebuild, and it also makes for more powerful, robust and flexible digital experiences. Listed below are a few preconditions that you should remember before you begin –

  • Update your Drupal 7 website to the latest version available. This will help in cleaner automatic upgrades of some of the modules that have direct upgrade paths.
  • Make sure you have access to the Drupal 7 website’s database and files (public and private).
  • Create a backup of the Drupal 7 website and use this backup for the Drupal 8 migration. It is not recommended to migrate a live functional website although the migration itself does not make any modifications to the source.
  • Download a fresh installation of Drupal 8 from here and enable the core Migrate modules discussed below. And remember, it MUST be fresh! Any configurations made or content created will be overwritten when an upgrade is performed.
  • There is no direct upgrade path from Drupal 7 to Drupal 8 (unlike in previous version upgrades). Familiarize yourself with Drupal 8’s migration system and the three modules that are in core: Migrate, Migrate Drupal and Migrate Drupal UI.
  • Make your migration choice – will you use Drush (which gives you granular control) or will you go with the browser user interface (easier but less control)?
  • Know your source. The Drupal 8 migration system’s flexibility allows content to be extracted and loaded from older versions of Drupal and various other sources like CSV, XML, JSON, MySQL, etc.
  • Perform a thorough content audit to identify content that you need to migrate. Discard unused and irrelevant content to avoid spending time and effort in migrating them.
The Migration Process (Step by Step)
  • Observe and Plan

Identify the content types and content structure of the existing site and document the observations. This includes content types, field types, blocks, taxonomies, etc. Prepare a plan on what you need to migrate and what needs to be merged, based on these observations. Analyze the views and other site configurations and catalog them so it is easier to replicate them in Drupal 8.

  • Create a modules checklist of your Drupal 7 website

With this checklist you should be able to identify modules that you still need, or if there is a Drupal 8 version of that module, or if the module has now moved to Drupal 8 Core (like the Media module). Not all Drupal 7 modules can be automatically migrated to Drupal 8. Some Drupal 7 modules may have merged their functionality into a single Drupal 8 module. And some Drupal 7 modules may have separated their features into two or more Drupal 8 modules. It is always better to analyze such cases to be sure there is no data loss!

     Expert Recommendation – Use a module like the Drupal Migrate UI to identify the Drupal 7 modules and their corresponding Drupal 8 modules (where available).

No module version available for Drupal 8? For example, the ImageField module in Drupal 7 does not have a corresponding D8 module. We might have to find the most suitable alternative available in Drupal 8. Of course, we have the Drupal 8 core Media module. However, we will have to develop custom scripts to migrate the image data.

Expert Recommendation – If you have just inherited a D7 website and have no idea about the customizations made to the modules, or if you have made customizations yourself and need to find them, we recommend using the Hacked! module. This module will go through the list of modules available on the site and report the changes/customizations made to each module.

  • Replicate and Build
    Replicate the content types, taxonomies and all the entities that are required in your D8 instance. Views must be built manually once you have the content created and replicated.

Expert Recommendations – 

  1. Template files (.tpl.php) in Drupal 7 should be rewritten as Twig files; Twig is part of the Symfony 2 framework.
  2. Make sure you rewrite your custom modules to follow Symfony standards.
  • Implementing the Migration 

The most awaited and significant step has arrived. As discussed earlier, there are two ways of migrating your Drupal 7 data to Drupal 8 -

  1. Running a migration with the Drupal UI
  2. Running a migration with Drush

The latter is recommended as it is more efficient, can be incorporated into shell scripts, and has clearer error messages. 

Based on the catalog of the content and the extracted data you need to build the migration scripts where you map the content type attributes of Drupal 7 with the newly built content type attributes of Drupal 8.
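For reference, a custom Drupal 8 migration script of this kind is defined as a YAML plugin. The sketch below is a hypothetical example (the migration ID, node type and process mappings are assumptions, not taken from any specific project):

```yaml
# Hypothetical migration definition mapping D7 article nodes into D8.
id: example_d7_articles
label: 'Import D7 article nodes'
source:
  plugin: d7_node
  node_type: article
process:
  title: title      # D7 source property -> D8 destination field
  created: created
destination:
  plugin: 'entity:node'
  default_bundle: article
```

Definitions like this live in a custom module and are executed by the Migrate API, for example via Drush.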

Drupal UI Method

Leveraging the Migrate UI Drupal 8 module, you can start by visiting the /upgrade path of the Drupal 8 website. The upgrade review page will show you a list of modules in your Drupal 7 site that can and cannot be automatically migrated to Drupal 8. For modules that have their functionalities in another D8 module but not exactly the same (like the AddressField module in D7 is now Address module in D8), you will need to install and enable the corresponding D8 module and restart the migration process. Next you can go ahead with importing the data from a data source.

Drush Method –

Are you comfortable using the terminal? If so, you should opt for the Drush method for migration. It provides a set of commands for the data migration process with better status messages. Check out this tutorial if you’re looking for a step-by-step migration procedure using Drush commands. Never used Drush before? This guide will help you understand the basics of Drush with a list of useful commands for migration.

You might run into some conflicts now. Make sure you have checked for known issues on Drupal.org and how to fix them. Once fixed, you can run the migration process, which gives continuous logs/feedback of the actions taken. Lastly, check the logs for any errors, fix them and you are good to go!

Expert Recommendation – Wait! Once you have the content created, never overlook SEO/page views. We need the content to keep the same URL paths. Do not forget to migrate the URL aliases and meta tag information of the content from the old Drupal 7 site.

  • Testing

It is very rare to run into zero issues during a Drupal 7 to Drupal 8 migration. Once the migration is completed, regression testing of the freshly imported configuration and content, to identify any potential bugs or issues, is absolutely necessary.

Challenges and (more Expert) Recommendations
  • Many Drupal 7 contributed modules have better versions of themselves in Drupal 8, and some have been deprecated. For example, the Field Collection module, which is used for grouping fields in Drupal 7, is soon to be deprecated. This module’s functionality has been added to the Paragraphs module and the Entity Reference Revisions module in Drupal 8. The Drupal 8 Paragraphs module gives content editors/authors immense flexibility to create seamless forms and structures. If you need to migrate the Field Collection module and map it to the Paragraphs module (D8), you will need to write custom plugins to map the content between Field Collection fields and Paragraph fields. Or, if you still want to continue using the Field Collection module in Drupal 8, this field mapping can be handled by the available core migrate plugins.
  • Do you use Panels for creating your landing pages like the Home page, Dashboard, etc., even if you only need to place a block on the home page? Layout Builder to the rescue! Layout Builder in Drupal 8 makes it easier for a content editor to customize a landing page. Let’s make the best use of Drupal 8’s features. To migrate from Panels to Layout Builder, you will need to write some custom migration plugins.
  • While migrating users, we will have to preserve passwords as well, so that users do not have to recreate their passwords on the new site. Passwords are hashed content, so you must find the hash algorithm used on the source site. Next, write a process to validate the migrated password against the rehashed password using the same algorithm.
  • When running a migration, you may exhaust your system resources which may cause your migration to stop. Thanks to highwater marks, you can run the migration again and it should pick up from where it left off.
  • The widely used Features module in Drupal 7 has now become almost obsolete, after Configuration Management took over all of the Features functionality, and more, in Drupal 8. Although the Features module is available in Drupal 8 as well, it is highly recommended to leverage Drupal 8’s Configuration Management system. It is not only simpler to work with, it is also easy to export between environments, and it uses the YAML file format instead of PHP, which is more readable and a proper data format.
  • Facing problems with your migration? There are several ways to report failures and get help -

        - Drupal Upgrade Issue Queue

        - Module issue queue if you find a bug or exception with a core/contributed module

        - The #drupal-migrate IRC channel on Freenode

        - The #migration channel on Drupal Slack

        - Hire a Drupal Expert
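Circling back to the password challenge above: validating a migrated password means rehashing the login attempt with the source site's algorithm and comparing. A simplified Python sketch (the single salted SHA-512 round is an assumption for illustration only; Drupal 7 actually uses an iterated, phpass-style scheme):

```python
import hashlib

def verify_migrated_password(candidate: str, stored_hash: str, salt: str) -> bool:
    """Rehash the candidate with the source site's (assumed) algorithm and
    compare against the hash carried over during the migration."""
    rehashed = hashlib.sha512((salt + candidate).encode()).hexdigest()
    return rehashed == stored_hash

# Simulate a hash migrated from the source site, then validate a login.
salt = "s0m3salt"
migrated = hashlib.sha512((salt + "correct horse").encode()).hexdigest()
print(verify_migrated_password("correct horse", migrated, salt))
print(verify_migrated_password("wrong guess", migrated, salt))
```

In a real migration you would identify the exact algorithm and iteration count from the source site rather than assuming one.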


Migrating to Drupal 8 has become more important now than ever before. There is so much to look forward to in terms of features and flexibility. Moreover, upgrading to Drupal 9 and subsequent versions is going to be super easy once on Drupal 8. No two migrations are the same. You always need to tweak and modify your migration process to suit every website’s complexity. Drupal 8’s migration system provides a robust framework to move content and configurations from Drupal 7 to Drupal 8 through various data sources. 
Looking for expert Drupal 8 migration services to accelerate your business growth? Feel free to contact us and we will get back to you shortly.


Palantir: Navigating Complex Integrations, Part 2: Planning Your Integration

Thu, 02/27/2020 - 12:00

In Part 2 of our series, we provide a framework to help you ask the right questions before embarking on a complex integration project. If you haven't already, be sure to read Part 1 first.

Understanding that integrations can balloon the complexity of a project, you may be asking yourself: why do it this way? That’s a valid question!

Why Integrate? Why Not Consolidate?

With all the potential complexity surrounding an integration, it may be worth it to ask the question: is there an alternative option?

There are three main factors to consider when looking at the value of a potential integration:

  1. Where does the data live / who owns the data?
  2. What business logic or outcomes do we need from the data?
  3. Does the data need to be exposed to any other systems apart from the one integration we’re considering?

Often, the balance of integrating versus bringing an external service “in-house” (or into the platform) comes down to these three factors.

Data Ownership / Storage

It may well be that you cannot bring an external system into your platform natively (forcing an integration) because you don’t own the data you want to integrate with. This might be the case with any paid access to a database like a list of service providers or tweets. Someone else owns that data but allows you to expose it (or use it) on your site. You cannot bring that data into your platform natively, requiring you to go and fetch it from them.

Sometimes data ownership issues get complicated, too: just because your organization collects the data doesn’t necessarily mean you own it. This is sometimes the case with proprietary systems. Your organization may have fed the data into their system, but part of your license assigns them ownership of your data after you’ve submitted it. It is important to understand who owns the data you’re integrating with and where it ultimately can and cannot be stored.

Business Logic

Integrations usually bring with them some meaningful experience: either data to enhance some functionality or some processing of information for an outcome. For some examples, you could think here of a commenting platform that is used widely: having an account with that commenting platform may enable you to comment on many different sites on the Internet rather than needing to create an account on each site. Or, a small ecommerce site may integrate with a credit card processor that you’re already familiar with, so you can trust when you click “Buy now” and are taken to a familiar checkout experience, that the retailer is using solid technology to handle your private financial information.

In both of these examples, the meaningful outcome of interacting with this data isn’t owned by the platform itself: it is business logic that is owned by another company. The service you’re getting by integrating with them is that data transformation. The credit card processor takes a credit card number and turns that into a deposit in your bank account. The commenting platform stores all of the comments, their approval status, and who made them, so you don’t have to. Drawing bright lines around this functionality and understanding what the processes are that you’re relying on is key to understanding how hard or easy it may be to bring a particular piece of business logic to your platform, rather than outsourcing it.

Data Exposure

A system that makes the right data available to the right people at the right time can be difficult to set up and maintain! Even if you own your own data and you can replicate the business logic you need to perform on the data, you still need to evaluate whether or not other systems apart from yours rely on that information to be available in a particular format. If there are other such systems, then you’ll need to make the data available from your platform if you choose to bring that integration “in-house.”

This might mean setting up and hosting an API yourself. It might mean moving a database behind your corporate VPN. It might mean needing to comply with data protection regulations that you didn’t have to worry about before (like HIPAA or GDPR).

It is certainly possible to do (and sometimes it is very beneficial), but it does require careful examination of the whole system.
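To make the trade-off concrete: if you bring the integration "in-house" and other systems still need the data, "setting up and hosting an API yourself" might look like the following minimal sketch. It uses Python's standard library and a hypothetical list of service providers; a real deployment would also need authentication, TLS, and monitoring:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical dataset we now have to expose ourselves after bringing the
# integration "in-house."
PROVIDERS = [{"id": 1, "name": "Acme Clinic"}]

class ProviderAPI(BaseHTTPRequestHandler):
    """A read-only JSON endpoint for downstream systems that still need the data."""

    def do_GET(self):
        if self.path == "/providers":
            body = json.dumps(PROVIDERS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, format, *args):
        pass  # silence per-request logging

def start_server(port=0):
    """Serve on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), ProviderAPI)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Even this toy version hints at the new responsibilities: choosing a data format, handling bad requests, and keeping the service running for everyone who depends on it.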

How to plan an integration

With all that context, let’s set up a scenario: you’ve been asked to plan an integration. Your company is planning to start selling widgets online and would like to use a third-party shopping cart and credit card processing. How do you start planning for this discrete but complex set of integrations?

Define the Benefits

Like most plans, it helps to start by defining your desired end state. When this integration is done, what will your users be able to do? Will there be a new button on your website? Ten new buttons? Text entry fields? What is the value or benefit to the customer?

Write it down!

Put All the Cards on the Table

Next, investigate what information will flow through the system. Document it in as much detail as you can. Will there be products? If so, they probably have lots of data with them (prices, quantities, sizes, availability, etc). What about users? What is a user (a username/password combo? An email address? A shipping address and credit card)? Try to be as specific as you can be about both the data and the composition of the data you’re going to be working with. (If you’re integrating with a good third-party system, they may already have documentation about the data and data composition that you can use.)
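One lightweight way to put those cards on the table is to write the data down as explicit types. Here is a sketch in Python for the widget-store scenario; every field name is a hypothetical placeholder you would replace with what your systems actually exchange:

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical inventory of the data flowing through the integration.
# Writing it down as types forces you to be specific about its composition.

@dataclass
class Product:
    sku: str
    name: str
    price_cents: int          # storing prices in cents avoids float rounding
    quantity_available: int
    sizes: list = field(default_factory=list)

@dataclass
class User:
    email: str
    shipping_address: Optional[str] = None  # is a user just an email, or more?
```

Whether or not this code ever ships, the exercise surfaces exactly the questions the paragraph above asks: what is a product, what is a user, and which fields travel with each.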

Draw a Flowchart

[Figure: A simplified view of the flow of information through a system, documenting the requests that can happen, the storage locations, and the data transformations that make the system complete. It shows five codependent integrations.]

Once you have a good understanding of both the outcomes you’re aiming for and the pieces you have, you can start to line up the pieces sequentially in the order they’ll be used/consumed by the systems. Block out areas for where there will be business logic or transformations (these typically slot in right before storage, before display, or after initial read).

As you go through this process, make sure you write down your questions. If you don’t know what data looks like at some point during the process, note it! That is an area that could drive up complexity (or you might be able to discover the answer to that question and disambiguate the system a little more).

Your flowchart will drive your understanding of the complexity of the integration and now it’ll be a whole lot less ambiguous, too!


Integrations can be scary! They’re inherently complex: you’re dealing with data transportation, information ownership, data models, business logic, user experience, and display logic. Until you have a good framework for breaking down that complexity into some core component pieces, it can be really hard to wrap your head around how they all fit together.

By taking this approach, we at Palantir are able to estimate the complexity of particular integrations quickly during both our sales estimating and project planning phases of our projects.

Integration by Christer licensed under CC BY-NC-ND 2.0.

Strategy Industries Government Healthcare Higher Education

OpenSense Labs: Infinity Stones Of Drupal Development

Thu, 02/27/2020 - 10:10
Shankar, Thu, 02/27/2020 - 15:40

The Pulse of the Profession 2020 report revealed that nearly 11.4% of investment gets wasted because of poor project performance. Businesses that undervalue project management as a strategic competency for driving change often see their projects fail miserably. Managing a project can turn into a slippery slope when an organisation doesn’t have a solid grasp of all the moving pieces.

When it comes to managing a website development project, with so much at stake and so much in flux, adhering to best practices becomes immensely important. Drupal development is no different. From site-building to theming to backend development, doing everything right during the Drupal development process can be a formidable task. Every stage of the Drupal website development process must be taken care of in order to get the desired result. We call these stages the Infinity Stones of Drupal development.

In the Avengers film series, the Infinity Stones are a group of gems that grant an unprecedented amount of power to their owner. The Marvel Cinematic Universe timeline is packed with instances where these Infinity Stones are mentioned. The Drupal world has its own set of gems that, when put together in the right manner, lead to better project delivery. These are the essentials that have to fall into place during development to ensure the best yield.

Requirement gathering and analysis (Reality Stone)

The Reality Stone helps its owner in manipulating matter. In Drupal development, the requirement gathering and analysis phase acts as the Reality Stone.

Here, you receive inputs from all the stakeholders such as customers, salespeople, industry experts and developers. All the relevant information is gathered from the customer to build a solution that meets their expectations.

For instance, by setting up a meeting, the client and vendor can come to agreement on many things. A plenitude of factors that will affect the different parties involved with the project can be figured out. Questions like what’s the budget, what’s the deadline, who is the end user, and what’s the purpose of building this website can be raised and discussed. This stage helps resolve any ambiguities that might become a problem later on.

Architecture and design (Mind Stone)

Using the Mind Stone, an artificially intelligent peace-keeping program called Ultron was built. This powerful stone allows its owner to control the minds of others. The architecture and design stage of Drupal development is the equivalent of the Mind Stone. Once you have gathered the requirements from all the stakeholders, the next stage involves understanding what exactly was in the stakeholders’ minds and deriving a functional architecture from it. All the stakeholders can then analyse the design specification and offer their feedback and suggestions.

In this stage, several important questions can be pondered: which core and contributed Drupal modules can be utilised, should you go for Drupal 7 or Drupal 8 (a choice that will eventually determine the complexity of the upgrade path to Drupal 9), and should you go for a headless approach? Any mistake made while gathering stakeholders’ inputs and preparing the design and architecture plan can lead to cost overruns and project failure.

Development and implementation (Power Stone)

Someone who has acquired the Power Stone receives a colossal amount of energy that can even be used to destroy an entire planet. The development and implementation stage is the Power Stone of the Drupal website building process. Here, developers put all their energy into the actual development process, i.e. turning the vision into reality.

The implementation phase begins on the basis of the design document. The design is translated into source code, and numerous code reviews take place. The coding process follows Drupal development best practices, such as creating a clean content architecture by including all the fields and content types in the content structures; choosing a limited number of content types and fields to make things easier for content creators; and creating separate issues and patches to update existing code.

Testing (Space Stone)

In Marvel films, there is a stone that is hidden inside a blue cube called the Tesseract. The Space Stone, as it is called, gives its owner the power over space. You can create a portal from one part of the universe to another. The testing phase is your Space Stone of Drupal development.

The testing phase begins once coding is complete. To find every possible defect, testers have to open all the portals and do their magic, keeping a note of all the errors they find during their intensive search. From preparing test cases to identifying the scope for improvements in the user interface, user experience and speed, testers make a list of all the defects. These are then assigned to the developers to be fixed. Regression testing is performed until the end product meets the client’s expectations.

Deployment (Soul Stone)

Capturing and controlling others’ souls calls for the Soul Stone. The deployment phase is the Soul Stone of the Drupal website creation process. This is the phase where, after countless hours of hard work, you have the final output and are ready to see the result. You have complete control over the resulting website.

The Drupal website that has been built through all the requirement gathering and coding processes is deployed to the production environment (or User Acceptance Testing (UAT) is done first, based on the client’s needs).

Maintenance and improvement (Time Stone)

Doctor Strange uses the Time Stone to trap a villain in a time loop and stop him from destroying the Earth. Whether you need to rewind or fast-forward the time, the Time Stone has got you covered.

The maintenance phase kicks in once deployment is taken care of. This is the Time Stone of Drupal project management. If any issue springs up during UAT or in the production environment, the developers can quickly rewind and check the processes involved to find out where the error lies. Improvement becomes easier: as you go from the incipient stage to the deployment stage or vice versa, you can find the exact areas that need enhancement or are under threat of production downtime and act on them accordingly.

Project management (The Infinity Gauntlet)

The fancy golden glove that Thanos wears, with all the Infinity Stones fixed on it, is the Infinity Gauntlet. In Avengers, the Gauntlet, holding all the Infinity Stones, is the most sought-after object; with all the stones united in it, its owner wields a massive amount of power. Project management is key to Drupal development and plays the role of the Gauntlet. Like Thanos with the snap of a finger, project managers can deliver Drupal projects with ease if they ensure all the stages of Drupal development are followed accurately and effectively. Project managers are pivotal: they apply knowledge, skills, tools and techniques to project activities so that the end result meets the project requirements and ensures client satisfaction.

End thoughts

If Thanos, with the Infinity Stones on his Gauntlet, can become the most powerful being in the universe, a Drupal project can benefit from its Infinity Stones too. The Infinity Stones of Drupal development, from requirement gathering and design through deployment and maintenance, when followed efficaciously, can make the process of creating a Drupal web property smoother and quicker.

With timely and efficient project management, everything falls into place. OpenSense Labs takes the utmost care to adhere to best practices while developing a Drupal site. Ping us at to find out how we can bring all the Infinity Stones into their places and build an innovative Drupal website for your organisation.


Promet Source: The SEO and UX Connection

Wed, 02/26/2020 - 17:33
In our current, digitally driven business climate, search engine optimization (SEO) and optimal web experiences are inherently intertwined. Each feeds off and builds upon the other. 

Drudesk: Integrate social media on your website via helpful Drupal 8 modules

Wed, 02/26/2020 - 16:15

New days dictate new rules on the web. Having social media integration buttons on your website today is one of the crucial web design tips to ensure your business success.

If you have a website on Drupal, this post will be of special interest to you. We will discuss how social media integration works in Drupal 8, what modules there are in this area, and how to integrate social media on your website using one of them — the Easy Social Drupal module.

Droptica: And Metadata in Drupal

Wed, 02/26/2020 - 13:24

Metadata is one of the most important SEO optimisation issues. “Metadata” is the name given to the extensions of HTML documents that allow search engine robots to better understand the meaning of the individual subpages of your website. Drupal has supported handling metadata since version 7. In this article, I will present different ways to implement it.

Lullabot: Sending a Drupal Site Into Retirement Using HTTrack

Wed, 02/26/2020 - 12:58

Maintaining a fully functional Drupal 7 site and keeping it updated with security updates year-round takes a lot of work and time. For example, some sites are only active during certain times of the year, so continuously upgrading to new Drupal versions doesn't always make the most sense. If a site is updated infrequently, it's often an ideal candidate for a static site. 

Mediacurrent: Open Waters Ep 11: Enterprise Marketing with Lynne Cappozi

Tue, 02/25/2020 - 19:00


In this episode, we're joined by Lynne Cappozi, Acquia’s CMO.  Lynne weighs in on how to maximize your investment in Acquia products, top digital marketing challenges, and how open source is changing the game for marketers. 

About Lynne

Lynne is one of Acquia’s boomerang stories, first serving as CMO in 2009 and returning to Acquia in 2016 to lead the marketing organization into its next stage of growth. Prior to her experience at Acquia, Lynne held various marketing leadership roles in the technology space at companies such as JackBe, Systinet & Lotus Development, all of which were acquired during her tenure. Outside of her work at Acquia, Lynne is on the board of directors at the Boston Children’s Hospital Trust and runs a nonprofit through the hospital.

Audio Download Link

Project Picks
  1. CVent
  2. GoGoGrandparent
  • Tell us about yourself and your role at Acquia.
  • What does Acquia do?
  • How has open source changed the practice of marketing for Acquia’s customers?
  • What kind of organizations make up Acquia’s customer base?
  • Being a marketer yourself, what do you see as the biggest challenge for enterprise marketers as we head into 2020?
  • What is Acquia doing to help marketers overcome those challenges?
  • Where do digital agencies like Mediacurrent fit into Acquia’s ecosystem?
  • What can marketers do to get the most value out of their investment in Acquia products?

Thanks for tuning in for another episode of Open Waters!  Looking for more useful tips, technical takeaways, and creative insights? Visit to subscribe and hear more episodes.


Tag1 Consulting: Insider insights on the commercial and API landscapes (part 1)

Tue, 02/25/2020 - 18:59
Over the last five years, decoupled Drupal has grown from a fringe topic among front-end enthusiasts in the Drupal community to something of a phenomenon when it comes to coverage in blog posts, tutorials, conference sessions, and marketing collateral. There is now even a well-received book by this author and a yearly conference dedicated to the topic. For many Drupal developers working today, not a day goes by without some mention of decoupled architectures that pair Drupal with other technologies. While Drupal’s robust capabilities for integration are nothing new, there have been comparatively few retrospectives on how far we’ve come on the decoupled Drupal journey.

Mediacurrent: Mediacurrent Sessions at DrupalCon 2020

Tue, 02/25/2020 - 17:19

DrupalCon 2020 sessions are here! The Mediacurrent team is proud to present 9 sessions at this year’s annual conference in Minnesota. 

With topics ranging from Drupal 9 to personalization to tips for preventing burnout, we’ll be sharing our Drupal knowledge from all angles. Here’s the presentation line up: 

Site Building, Development, and Coding

Page building showdown: Paragraphs v Layout builder
Presented by: Jay Callicott, VP of Technical Operations  

Join this session for an honest comparison of the current champ in Drupal 8 contrib, Paragraphs, versus Layout builder. 

Managing images in large scale Drupal 8 & 9 websites
Presented by: Mario Hernandez, Head of Learning 

Knowing how to properly configure your site to handle images can make a big difference in converting leads, getting more sales, and getting more visitors on your site.

On the JAMStack with Gatsby and Drupal 8
Presented by: Bob Kepford, Director of Development and Ally Delguidice-Bove, Digital Strategist. Plus Sanjay Naruda, MagMutual and Ben Robertson, Gatsby

This session will be an inside look at our decoupled approach for combining open-source frameworks like Gatsby, Drupal 8, and Serverless, as well as third-party services for user management, a learning management system, and private APIs to build a robust custom platform.

Being Human, Contributions, and Community

How to plug into your passion and prevent burnout
Presented by: Brian Manning, IT Operations Manager and Victoria Miranda, Project Manager

Learn how to identify burnout — for yourself, your team, and your project.

Creating an organizational culture of giving back to Drupal
Presented by: Dave Terry, Co-founder and Partner 

Explore Mediacurrent’s journey around creating a culture of giving back and get inspired with actionable ideas.

Content and Digital Marketing 

Contextual, not creepy: Personalization tools, tricks, & tips 
Presented by: Ally Delguidice-Bove, Digital Strategist 

In this session, see how Empathy Mapping can help you create contextual and personal experiences for your users. 

Digital psychology & persuasion to increase user engagement
Presented by: Cheryl Little, Senior Director of User Experience; Becky Cierpich, UX/UI Designer; Danielle Barthelemy, Senior Digital Strategist  

Explore the psychological principles that drive human behavior and learn about the tools and techniques that can be used to captivate visitors' attention and enhance the user experience on your website. 

Leadership, Management, and Business

From tech expert to team leader: Lessons for making the leap
Presented by: Kelly Dassing, Senior Director of Project Management and Mark Shropshire, Senior Director of Development  

If you’re in the process of transitioning from a technical role to management, this session is for you! 

User Experience, Accessibility, and Design 

One usability step at a time: Improve your site with a UX audit
Presented by: Cheryl Little, Senior Director of User Experience and Becky Cierpich, UX/UI Designer

Start off on the right foot when planning website improvements. See how a UX audit can help.

Summit and Training Events

Register to join Mario Hernandez and Eric Huffman for a tutorial on Component-based theming with Twig. If you're in the healthcare field, be sure to join the Healthcare Summit, where Mediacurrent is a presenting sponsor.

Electric Citizen: Get Ready for DrupalCon Minneapolis

Tue, 02/25/2020 - 16:02

After over a decade of our team traveling to other cities, the annual DrupalCon North America is coming to our hometown! 

I've been attending DrupalCons each year, starting with DrupalCon Chicago in 2011. While I've had an incredible time visiting all these other great cities across the US, there's something special about being the host city. And I'm confident you'll love it too.

Here's your short to-do list:

Palantir: Navigating Complex Integrations, Part I: Understanding the Landscape

Tue, 02/25/2020 - 12:00

In the first part of this two-part series, we explore the factors that drive complexity when integrating third-party data sources with large-scale digital platforms.

We use the word integration a lot when we’re talking about building large-scale digital platforms. The tools we build don’t stand in isolation: they’re usually the key part of an entire technology stack that has to work together to create a seamless user experience. When we talk about “integrating” with other services, usually we’re talking about moving information between the digital platform we’re building and other services that perform discrete tasks.

Platforms are not encapsulated monoliths. For just about any feature you could imagine for a platform, there may be a third-party service out there that specializes in doing it and you can optimize (for cost, output, functionality, ease of use, or many other reasons) by choosing to strategically integrate with those services. When we architect platforms (both big and small), we’re usually balancing constraints around existing vendors/service providers, existing data sets, and finding cost and functionality optimizations. It can be a difficult balancing act!

Some examples of integrations include:

  • On a healthcare website, clicking the “Make an Appointment” button might take you to a third-party booking service.
  • On a higher-education website, you might be able to view your current class schedule, which comes from a class management system.
  • On a magazine site, you might not even know that you’re able to read the full article without a paywall because the university network you’re browsing from integrates with the publisher’s site to give you full access.
  • On a government website, you might be able to see the wait time for your local Department or Registry of Motor Vehicles.

In short: an integration is a connection that allows us to either put or retrieve information from third-party data sources.

What Drives Complexity in Integrations

There are four main factors that drive complexity in integrations:

  1. Is it a Read, Write, or Read/Write integration?
  2. What is the data transportation protocol?
  3. How well-structured is the data?
  4. How is the data being used?

Read, Write or Read/Write

When we talk about reading and writing, we’re typically talking about the platform (Drupal) acting on the third-party service. In a Read-only integration, Drupal pulls information from the third-party service and either processes it or just displays it alongside other information that Drupal is serving. In a Write-only integration, Drupal sends information to a third-party service but doesn’t expect processed data back (the service will often send status messages back to acknowledge receiving the data, but that’s baked into the process and isn’t really a driving factor for complexity). The most complex type is a Read/Write integration, where Drupal is both writing information to a third-party service and getting information back from that service for processing or display.

Access Control

It is impossible to separate the idea of accessing information from the question: is the data behind some type of access control? When you’re planning an integration, knowing what kind of access controls are in place will help you understand the complexity of the most basic mechanics of an integration. Is the data we’re reading publicly accessible? Or is it accessible because of a transitive property of the request? Or do we have to actively authenticate to read it? Write operations are almost always authenticated. Understanding how the systems will authenticate helps you to understand the complexity of the project.
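For example, an actively authenticated read often amounts to attaching credentials to every request. A sketch using Python's standard library, where the endpoint URL and token are hypothetical placeholders rather than a real API:

```python
import urllib.request

# Hypothetical endpoint behind access control; not a real API.
API_URL = "https://example.com/api/records"

def build_authenticated_request(url: str, token: str) -> urllib.request.Request:
    """Attach a bearer token so the remote service can authorize the read."""
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )

# The request object is ready to be sent with urllib.request.urlopen(...).
request = build_authenticated_request(API_URL, token="s3cret-token")
```

Knowing whether each read or write needs a step like this (and how tokens are issued, rotated, and revoked) is a large part of estimating the integration's complexity.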

Transportation Protocol

In thinking about the transportation protocol of the data, we expand the definition beyond the obvious HTTP, REST, and SOAP to include things like files that are FTP’ed to known locations, and direct database access, where we write our own queries against a data cache. The mechanics of fetching or putting data affect how difficult the task can be. Some methods (like REST) are much easier to use than others (like FTP’ing to a server that is only accessible from within a VPN that your script isn’t in).

REST and SOAP are both protocols (in the wider sense of the word) for transferring information between systems over HTTP. As such, they’re usually used in data systems that are meant to make information easy to transport. When they’re part of the information system, that usually implies that the data is going to be easier to access and parse because the information is really designed to be moved. (That certainly doesn’t mean it’s the only way, though!)

Sometimes, because you’re integrating with legacy systems or systems with particular security measures in place, you cannot directly poll the data source. In those cases, we often ask for data caches or data dumps to be made available. These can be structured files (like JSON or XML, which we’ll cover in the next section) that are either made available on the server of origin or are placed on our server by the server of origin. These files can then be read and parsed by the integrating script. Nothing is implied by this transportation method: the data behind it could be extremely well structured and easy to work with, or it could be a mess. Often, when we’re working with this modality, we ask questions like: “how will the file be generated?”, “can we modify the script that generates the file?”, and “how well structured is the data?”. Getting a good understanding of how the data is going to be generated can help you understand how well-designed the underlying system is and how robust it will be to work with.
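When the transport is a data dump rather than a live API, the integrating script ends up doing something like the following sketch. It assumes a JSON file; the validation step matters precisely because nothing about this transport method guarantees good structure:

```python
import json
from pathlib import Path

def load_data_dump(path: str) -> list:
    """Read a structured data dump (JSON here) left for us by the origin server.

    The transport implies nothing about quality, so check the shape of the
    data before trusting it downstream.
    """
    records = json.loads(Path(path).read_text(encoding="utf-8"))
    if not isinstance(records, list):
        raise ValueError(f"expected a list of records, got {type(records).__name__}")
    return records
```

The questions in the paragraph above ("how will the file be generated?", "how well structured is the data?") decide how much validation and normalization this loader would really need.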

Data Structure

When data folks use phrases like “data structure,” I think there’s a general assumption that everyone knows exactly what they mean. It is one of those terms that seems mystical until you get a clear definition, and then it seems really simple.

When we talk about the data structures in the context of integrations, we’re really talking about how well-organized and how small the “chunks” of data are. Basically: can you concisely name or describe any given piece of information? Let’s look at an example of a person’s profile page. This could be a faculty member or a doctor or a board member. It doesn’t matter. What we’re interested in finding out is this: when we ask a remote system to give us a profile, it is going to respond with something. Those responses might look like any of the following:

  • Composed (“pre-structured”) response: Name: Jane Doe, EdD.; Display Name: Samantha Doe, EdD.
  • Structured response: First Name: Samantha; Last Name: Doe; Title: EdD.
  • Structured response, with extra processing needed: First Name: Jane; Last Name: Doe; Title: EdD.; Preferred Name: Samantha

All of these responses are valid and would (presumably) result in the same information being displayed to the end user but each one implies a different level of work (or processing) that we need to do on the backend. In this case: simpler isn’t always better!

If, for example, we’re integrating with this profile storage system (likely an HR system, or something like that) so we can create public profiles for people on a marketing site, we may not actually care what their legal first name for HR purposes is (trust me, I’m one of those folks who goes by my middle name—it’s a thing). Did you catch that in the third example above this person had a preferred name? If the expected result of this integration is a profile for “Samantha Doe, EdD.”, how do we get there with these various data structures? They could each require different levels of processing in order to ensure we’re getting the desired output of the correct record.
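As a sketch of that processing, here is how a display layer might derive the desired output from the third (granular) response above. The field names are hypothetical stand-ins for whatever the HR system actually returns:

```python
def display_name(record: dict) -> str:
    """Build a public profile name like 'Samantha Doe, EdD.' from a granular record.

    Granular structure is what makes honoring a preferred name a one-line
    decision instead of string surgery on a composed response.
    """
    first = record.get("preferred_name") or record["first_name"]
    name = f"{first} {record['last_name']}"
    title = record.get("title")
    return f"{name}, {title}" if title else name
```

With the composed ("pre-structured") response, the same requirement would instead mean parsing and rewriting a display string, which is exactly the partially-processed middle ground that drives up complexity.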

The more granularly the information is structured, the easier it is to process for the purpose of changing the information.

At the other end of the spectrum is data that will require no processing or modification in order to be used. This is also acceptable and generally low complexity. If the origin data system is going to do all of the processing for us and all we’re doing is displaying the information, then having data that is not highly granular or structured is great. An example of that might be an integration with a system like Twitter: all you’re doing is dropping in their pre-formatted information into a defined box on your page. You have very little control over what goes in there, though you may have control over how it looks. Even if you can’t change the underlying data model, you can still impact the user’s experience of that information.

The key here, for understanding the complexity of the integration, is that you want to be on one extreme or the other. Being in the middle (partially processed data that isn’t well-structured) really drives up effort and complexity and increases the likelihood of there being errors in the output.

Data Usage

One of the best early indicators of complexity is the answer to the question “how is the data being used?” Data that is consumed by or updated from multiple points on a site is generally going to have more complex workflows to account for than data that is used in only one place. This doesn’t necessarily mean that the data itself is more complex, only that the workflows around it might be.

Take, for example, a magazine site that requires a subscription in order to access content (i.e., a paywall). The user’s status as a subscriber or anonymous user might appear in several different places on the page: a “my account” link in the header, a hidden “subscribe now!” call to action, and the article itself actually being visible. Assuming that the user subscription status is held in an external system, you might now be faced with the architectural decisions: do we make multiple calls to the subscription database or do we cache the response? If the status changes, how do we invalidate that cache throughout the whole system? The complexity starts to grow.
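One common answer to those architectural questions is to cache the subscription lookup and invalidate it when the status changes. A minimal sketch, in which the lookup function is a hypothetical stand-in for the real call to the external subscription service:

```python
# Cache the subscription check so one page view (header link, call to action,
# article visibility) doesn't hit the external service three times.

class SubscriptionCache:
    def __init__(self, lookup):
        self._lookup = lookup  # e.g. a call out to the subscription service
        self._cache = {}

    def is_subscriber(self, user_id: str) -> bool:
        if user_id not in self._cache:
            self._cache[user_id] = self._lookup(user_id)
        return self._cache[user_id]

    def invalidate(self, user_id: str) -> None:
        """Call when a subscription changes so every display point re-reads."""
        self._cache.pop(user_id, None)
```

The hard part in a real system is not the cache itself but wiring the invalidation through every place the status is displayed, which is precisely the complexity the paragraph above describes.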

Another factor in the data usage to consider is how close the stored data is to the final displayed data. We often refer to this as “data transformations.” Some types of data transformations are easy, while others push the bounds of machine learning.

If you had data in a remote system that was variable in length (say, items in a shopping cart), then understanding how that data is stored AND how it will be displayed is important. If the system providing the data gives you JSON, where each item in the cart is its own object, then you can do all kinds of transformations with the data. You can count the objects to get the number of items in the cart; you can display them on their own rows in a table on the frontend system; you can reorder them by a property on the object. But what if the remote system is providing you a comma-separated string? Then the display system will need to first transform that string into objects before you can do meaningful work with them. And chances are, the system will also expect a CSV string back, so if someone adds or removes an item from their cart, you’ll need to transform those objects back into a string again.
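As a sketch of that round trip (the `sku:qty` comma-separated format here is hypothetical, purely to illustrate the shape of the problem):

```python
def parse_cart(csv_string):
    """Turn a remote system's 'sku:qty,sku:qty' string into structured
    items we can count, reorder, and render as table rows."""
    items = []
    for part in csv_string.split(","):
        if not part:
            continue  # tolerate an empty cart string
        sku, qty = part.split(":")
        items.append({"sku": sku, "qty": int(qty)})
    return items

def serialize_cart(items):
    """Transform the structured items back into the flat string the
    remote system expects after the cart changes."""
    return ",".join(f"{item['sku']}:{item['qty']}" for item in items)
```

Each transformation is trivial on its own; the complexity comes from having to maintain both directions and keep them in sync as the remote format evolves.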

All of this is rooted in a basic understanding: how will the data I’m integrating be used in my system?

Come back on Thursday for Part 2, where we’ll provide a framework to help you make sure you’re asking the right questions before embarking on a complex integration project. 

Complex Adaptive System by Richard Ricciardi licensed under CC BY-NC-ND 2.0.


Microserve: A commitment to quality: Creating a robust QA process

Tue, 02/25/2020 - 09:48
Joe Bransby

Having worked in software quality assurance for over 10 years, I have helped many organisations set up internal QA teams and rigorous QA processes from scratch. It’s both a rewarding and challenging task!

As part of Microserve’s commitment to producing high-quality work for our clients, I joined the team in 2018, tasked with building a QA team and creating a suite of processes. One of our values as a business is ‘excellence as standard’. Prioritising quality in this way would ensure that we provide our clients with the excellence we strive for.

Blog: Developer guide to better UI/UX design

Tue, 02/25/2020 - 07:34

In this post, I'll share the basic principles of UI/UX design that I follow as a developer while working on projects that have little or no design prepared by the client. I hope they'll help you optimize your workflow and lead to greater satisfaction for your clients.


Blue Drop Shop: Drupal Recording Initiative: #DrupalCampNJ and #FLDC20

Mon, 02/24/2020 - 21:48
kthull

I post updates on LinkedIn and to backers of the Drupal Recording Initiative, but I suppose blasting these via Planet Drupal is also a good idea. Well, at least until adds that functionality (you can track that issue here).



That puts the total number of captured sessions at 2,147. If you find these session recordings valuable, please consider supporting my efforts. The US is fairly well covered; now it is time to focus on the rest of the world.

blog: Request for Sponsors: Automatic Updates Initiative Phase 2

Mon, 02/24/2020 - 21:40

The Drupal Association is seeking partners to help us advance the next phase of the Automatic Updates initiative.

The first phase of this work was generously sponsored by the European Commission, and supported by other partners including: Acquia, Tag1Consulting, Mtech, and Pantheon.

In this first phase, we accomplished a great deal:

  • Display of security PSAs directly in Drupal's admin interface
  • Automated readiness checks, to ensure that a site is prepared for updates
  • Automatic updates for Drupal Core in both Drupal 7 and Drupal 8.

But while this work laid the foundation, much work remains. The next phase aims to add support for:

  • Sites managed using Composer
  • Automatic updates with Contributed modules
  • A front-end controller providing support for easy roll-back

The Drupal Association needs partners in order to move this work forward. We're looking for both organizations that can provide financial support and teams with expert developers who can contribute to development.

If you are interested, you can find a detailed scope of the remaining work attached to this post.

Download the Request for Sponsors

Contact: with questions.

Drupal Association blog: Drupal contribution culture - your opinions, experience and perspectives matter

Mon, 02/24/2020 - 21:06

How do we encourage those capable of giving back to Drupal to start doing so, and once they are contributing, how do we encourage them to do more? Dries highlighted this conundrum during his keynote at DrupalCon Amsterdam 2019.

Whilst various mechanisms exist to recognise contributions in Drupal, if we are to cultivate and grow a contribution culture, we need to move beyond the current status quo. At DrupalCon, the Contribution Recognition Committee was proposed and self-nomination invited.

“The purpose of this committee is to recommend solutions for how we recognize contributions to the Drupal project made by both individual and organizational contributors, and to advise the Drupal Association on how to weight each type of contribution relative to the others.” – Tim Lehnen, Chief Technology Officer, Drupal Association.

What have we achieved so far?

For several months now, newly appointed committee members have been researching and discussing contribution culture within Drupal and open source. To ensure our recommendations are truly representative of both organisational and individual contributions, we are keen to canvass opinions and perspectives from far and wide.

Share your opinions, ideas and perspectives

An online survey is now available for those using or contributing to Drupal, so they can provide insights that will be considered in the committee's recommendations. I encourage you to participate and to help us reach members of the Drupal community in your local area.

Complete the survey today