In this article we will see how to use the Drupal migration framework to migrate custom sites to Drupal 8.
This is the latest post in the “Improving Drupal and Gatsby Integration” series. This time I will be talking about the Gatsby Boina Starter, which we are contributing to make your Drupal-Gatsby integration easier. The Boina starter ships with the main Gatsby configuration files you might need to get up and running on your Gatsby site.
Why should you trade Drupal's battle-tested content authoring and administration tools for a more interactive user experience?
The hotels we chose each offer an ideal hub—connecting you to a rewarding DrupalCon community experience.
As mentioned previously, we have been collaborating across the Drupal community on updating and expanding Drupal.org/community, and that work is ongoing. There are still wrinkles to resolve, such as how to make the menus on that page more obvious, but we are getting there.

Next step: community group sections
One of the things I was especially keen to do was to make areas for the groups of people that make our community work available under /community, and give them the tools and space to tell the world about:
- Who they are
- What they do
- How they work
- What their latest updates are
- How you can get involved.
Well, the framework to do this looks good and the first couple of sections are now available. You can see the following community groups already:
Each section will have “standard” home page content detailing the info above, as many content pages as the group can muster, and a blog that will go onto Drupal Planet.
Of course, a group will likely have content across many different parts of the Drupal.org website. I’m especially keen for all members of our community to be able to see what groups there are and how they work in one easy-to-consume place. Our project values challenge us all to clearly define how our community functions: "We foster a learning environment, prefer collaborative decision-making, encourage others to get involved and to help lead our community."

What about the community group you are a member of?
If you represent a community group and would like to join the growing list of those with sections under /community, please get in contact.
I’m looking at globally-relevant groups right now; maybe in the future we will look at what we can do to support local groups.
Imagine what could be possible when new members of our community come to /community and find right where they belong! I'm excited to see what's next.
The number eight in the Bible signifies resurrection and regeneration, a digit that implies “new beginnings.”
Just like the resurrection of Drupal, which loudly announced its new inception as a content management system, along with its ability to connect with SaaS CRMs like Salesforce.
Salesforce is at the heart of many businesses, allowing them to handle their sales data in one place and giving the highest priority to customer growth. And now that the integration is tighter than ever before, Drupal 8 can take advantage of it too.
Benefits of Integrating Drupal and Salesforce
So, instead of wasting any more of your time and beating around the bush, let's explore the paths that lead to this integration and the key considerations involved.
When you have a team of salespeople, small or big, how do you manage which territories or areas each of them is going after?
With the help of Salesforce, which can monitor and track almost anything you can imagine. Instead of managing old-school spreadsheets, a CRM like Salesforce can help you track and monitor all your tasks, saving the time and resources it takes to manage small and large teams. With the help of this CRM, you have the power to do many things, such as:
- Better management of lead processing and territories.
- Leads can be assigned to users according to the data that makes business sense.
- Instant email notifications help reps follow up with customers and prospects immediately.
- It can help you attain better efficiency.
Tracking competitors and managing opportunities
In this competitive world, it is important to track and manage your competitors, and you can do this with the help of Salesforce CRM.
Through its various built-in tools, it diligently ensures that each and every opportunity is followed up on and not forgotten, and it lets you respond faster to any client that enquires about your services or products, which ultimately shows your customers that you care about their business.
A good CRM system gives you the ability, from a business point of view, to track exactly what is happening and also accurately forecast the growth or decline of your business. For forecasting, Salesforce can also:
- Calculate forecasts including all the information from the sales team.
- Differentiate between booked and recurring revenue.
- Customize forecasts based on the parameters that make sense to the business.
The Salesforce CRM allows you to truly manage end-to-end customer relationships. You can see everything from the first time you engage with a client to when they place an order and beyond.
The best part about Salesforce CRM in terms of managing orders is that it can turn an estimate into an order and beyond with a single click of a button, with customized or automated reports based on what you need to see.

Architectural approaches
There are different architectural approaches to data flow that provide for different requirements and satisfy different needs:

| Technology | Description | Strengths | Weaknesses |
| --- | --- | --- | --- |
| Real-Time Push | Sends data immediately on entity creates, updates and deletes | Fast, limited update lag, avoids UX delays, can avoid race conditions | Less durable and reliable |
| Cron Based Sync | Identifies records requiring sync on cron | Handles large volumes well, can be stopped and started as needed | Slow, lags, and risks update conflicts |
| Work Queue | A single point of integration receives data and actions it | Reliable, performant and has a shorter time lag | Large changes create backlogs, risk of update conflicts |
Real-Time Push

With real-time integration, Drupal objects are exported to Salesforce immediately. You get feedback indicating whether an item failed to export and whether the data is available in Salesforce. This can be a great option if you need the data to be in Salesforce as close to real time as possible.
Cron Based Sync
Earlier, in Drupal 7, the asynchronous push left hiccups concerning error handling (which involved debugging and troubleshooting), optimization, API calls, etc.
Now, in Drupal 8, a Salesforce cron-based push service has been introduced to construct database queues, normalize queue items, optimize queue operations and implement error handling.
Cron-based sync lets Drupal's core API schedule synchronization from Salesforce to Drupal.
With the queue-based batching system running in the background, many objects can be sent to Salesforce as soon as possible, instead of all objects being sent to Salesforce at the same time. In this architecture, instead of an object being sent to Salesforce as soon as it is created, edited or deleted, it goes into a queue where it waits to be exported with the other items.
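The enqueue half of this pattern can be sketched with Drupal's Queue API. The queue name, module name and payload below are hypothetical, not the actual API of the Salesforce module; treat this as an illustration of the architecture, not a drop-in implementation:

```php
<?php

use Drupal\Core\Entity\EntityInterface;

/**
 * Implements hook_entity_update().
 *
 * Instead of pushing to Salesforce immediately, record the change in a
 * queue; a QueueWorker plugin can then export the items in batches on cron.
 */
function my_module_entity_update(EntityInterface $entity) {
  \Drupal::queue('my_module_salesforce_push')->createItem([
    'entity_type' => $entity->getEntityTypeId(),
    'entity_id' => $entity->id(),
    'op' => 'update',
  ]);
}
```

A @QueueWorker plugin with a cron definition would then pick these items up on Drupal's cron run and send them to Salesforce in batches.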
Queue items are then picked up on a configurable schedule and exported to Salesforce in batches. Batching the data helps with synchronization and increases performance by using fewer API calls.

Approaches suitable for integration
There are many ways to move your data from a website to another application, some of which are easier than others and allow integration in almost all projects. Here are some approaches suitable for integrating Drupal and Salesforce.
Simple web forms

Salesforce lets you create simple HTML web forms (Web-to-Lead or Web-to-Case) that generate lead or case records in Salesforce when they are submitted.
Any Salesforce administrator can create these forms and then paste them into Drupal for users to complete.
While this method doesn't address everything in every circumstance, there are specific situations when it is a good solution:
- Only basic user data or inquiry information needs to go into Salesforce.
- There is no or little expertise in web development.
- Something quick and easy is needed.
Third party form service
There are ample form services, like Formstack, Click & Pledge and Wufoo, that can pass data to Salesforce. With these, you can either embed the form in Drupal or let the user click through to the platform.
This method is suitable when the following conditions apply:
- There is a need to pass both user and transaction data into Salesforce.
- There is no need to move information in both directions.
- You may want users to log in to submit a form, or to return to the form and provide more information later.
- You want a sophisticated solution that doesn't really need to be customized.
Salesforce Suite

The Salesforce Suite is a collection of Drupal modules that allows synchronization of data between Drupal and Salesforce in one or both directions. The suite also provides a mapping tool that can be used to define the integration field-by-field and object-by-object.
Salesforce Forms

The simplest way to hook Drupal (or any other website) up with Salesforce is by simply linking to a form created by Salesforce. Any data the user enters goes directly into Salesforce, and Drupal is not involved at all.
This method is good for lead generation or a simple application form. One of the biggest advantages of using Salesforce forms is that it is not only cheap and easy to use, but there is zero setup on the Drupal side besides providing a link to the form.
There might be instances where you have content that lives in both Drupal and Salesforce and needs to stay in sync. Salesforce mapping does that task for everyone: it keeps a version of the data at both ends, and whatever happens to one version happens to the other too.
Rules can also be made to add, delete, push or pull data.
| Approach | Cost | Direction | Complexity |
| --- | --- | --- | --- |
| Simple Web Forms | Free | One direction, inbound to Salesforce | DIY |
| Third Party Form Service | Low | One direction | DIY or Developer Assistance |
| Salesforce Suite | Moderate to High | Bi-directional | Developer Assistance |
| Salesforce Mapping | High | Double-entering the same content in two places | Developer Assistance |
| Salesforce Forms | Low | Natural | DIY |

| | Integrating with One Direction | Integrating with Two Directions |
| --- | --- | --- |
| Useful when | You have to pass user data, transaction data, and specific node types, or want to keep the integration simple | Data is entered directly into Salesforce |
| Main Advantage | This approach limits complexity and therefore liability and errors | Fewer duplicate records are created in Salesforce |
| User Experience | No updates required to impact UX | Users need sophisticated interaction, such as the ability to view data you have entered offline |
| Use Cases | Donation forms, event registration | Donation forms, event registration |

Drupal modules are here to ease the integration with Salesforce
The Drupal Salesforce Suite module is a testament to both the ingenuity and passion of the Drupal community and the flexibility of Drupal as an enterprise platform. As a contributed module, the Salesforce Suite for Drupal enables out-of-the-box connection with Salesforce, no matter what your configuration is. It supports integration by simply synchronizing Drupal entities (e.g. users, nodes) with Salesforce objects (e.g. organizations, contacts).
The Drupal community, as a matter of fact, has contributed a lot to this area. It has come together to sponsor the development of the suite of Salesforce integration modules that can deal with a variety of business needs. To rewrite the module, the community gathered the time and the resources, taking full advantage of the advances made in the Drupal and Salesforce platforms. To put it all together, the suite has been rearranged into a modular architecture exposing core functionality via an API, enabling other systems to build on it (e.g. Springboard, Jackson River's fundraising platform). Most importantly, the Drupal Salesforce Suite uses OAuth 2.0 for access control.
For non-technical users, the Drupal entity and Salesforce object mapping system provides the power to configure data maps between any objects in the two systems. Not only this, but synchronization between any Drupal entity and Salesforce object (e.g. Drupal users, donation receipts) has been made easy. The suite also presents its users with a lightweight wrapper around the SOAP API, which has more capabilities for some use cases, using the same OAuth authorization.

Examples of Drupal and Salesforce integration use cases
Springboard, Jackson River's innovative solution for online fundraising and marketing, is a packaged distribution of Drupal for non-profit organizations. It needed to accept online donations and wanted to use Drupal to power other user touch points such as petitions, email registration and more. Springboard presents a robust integration queue for bi-directional sync of data between Drupal and the Salesforce.com CRM.
RedHen CRM has been designed for the needs of membership organizations and associations. The RedHen framework is extensible and flexible and can be leveraged to produce a broad range of CRM solutions. For instance, RedHen could be used as a lightweight sales pipeline management tool for small businesses, or as an integration point between Drupal and much larger enterprise CRM solutions such as Salesforce.

Case study: Cornell University
The university offers hundreds of opportunities to students, including those living abroad. But to take advantage of these opportunities, students had to navigate a maze of departments and websites. To solve this, the Cornell University Experience Initiative (CUEI) came up with a plan to bring a “Netflix-like” experience to students: a customizable user guide making it easy for students to find opportunities.
Pantheon was chosen as the hosting platform. The team wanted to maintain their content with Drupal but also manage student applications and data with Salesforce CRM. They chose Message Agency as their partner to help conceptualize how Drupal and Salesforce would work together. Message Agency is also an architect of the Salesforce Suite, the set of Drupal modules that allows integration of these two powerful solutions.
Interested students come to the site to find things and explore. Drupal does a really good job at that task, but when it comes to actions and customization, Salesforce wins. Together they created a whole new paradigm of student communication and interaction.
Centralizing information also provided Cornell with opportunities: each department had its own individual page or site with its own content strategy. But before the website went live, the CUEI team tested the user experience with its most trusted stakeholders: Cornell students.
The feedback they received was overwhelming, with positive reviews telling how great and well organized the website was. Not only this, but Pantheon also evaluated the site's performance under traffic load, given its complexity and image-heavy design.

The Future
The wide range of what Salesforce and Drupal make possible gives us a vivid idea of how sales can be increased across marketing organizations. If you take one view away from all of the above, it should be this: there's definitely an integration that will work for your organization's needs and budget, though it might not be as efficient as a full Salesforce-Drupal integration.
If you are able to get a Drupal-Salesforce integration deployed in your organization, there is no doubt that you will enjoy streamlined and optimized business processes in both the short and long term, boosting sales and making the entire process much more comfortable and effective. Bear in mind, though, that the flexibility and customizability of Salesforce can prove troublesome when it comes to the consistency of your back end.

Conclusion
Drupal installations are all unique because of the different modules and customizations that they use, so integration has to be set up in a different manner by an expert.
If you already have a Salesforce instance set up, we'll be happy to explore the appropriate integration options. If you're new to Salesforce, we can work with your Salesforce developers to make sure your data is structured in a way that minimizes the integration effort and costs.
In this article we will see how to update data models in Drupal 8, how to distinguish between model updates and content updates, how to create default content, and finally the procedure to adopt for successful deployments, avoiding surprises in a continuous integration/delivery Drupal cycle.
Before we start, I would encourage you to read the documentation of the hook hook_update_N() and to take into account all the possible impacts before writing an update.
Updating the database (executing hook updates and/or importing the configuration) can be a problematic task during a Drupal 8 deployment process, because the order of updating actions for structure and data is not well defined in Drupal, and this can pose several problems if not completely controlled.
It is important to differentiate between a contributed module to be published on drupal.org aimed at a wide audience, and a custom Drupal project (a set of Drupal contrib/custom modules) designed to provide a bespoke solution in response to a client’s needs. In a contributed module it is rare to have a real need to create instances of configuration/content entities, on the other hand deploying a custom Drupal project makes updating data models more complicated. In the following sections we will list all possible types of updates in Drupal 8.
The Field module allows us to add fields to bundles. We must distinguish between the data structure that will be stored in the field (the static schema() method) and all the settings of the field and its storage, which are stored as configuration. All the dependencies related to the configuration of the field are stored in the field_config configuration entity, and all the dependencies related to the storage of the field are stored in the field_storage_config configuration entity. Base fields are stored by default in the entity’s base table.
Configurable fields are fields that can be added via the UI and attached to a bundle; they can be exported and deployed. Base fields are not managed by the field_storage_config and field_config configuration entities.
To update an entity definition or its component definitions (field definitions, for example, if the entity is fieldable) we can implement hook_update_N(). In this hook, don’t use APIs that require a full Drupal bootstrap (e.g. the database with CRUD actions, services, …); to do this type of update safely we can use the methods proposed by the EntityDefinitionUpdateManagerInterface contract (e.g. updating the entity keys, updating a base field definition common to all bundles, …).
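As a sketch of what such an update might look like (the field name my_field and module my_module are hypothetical), a hook_update_N() implementation can install a new base field definition through the entity definition update manager:

```php
<?php

use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Installs a new (hypothetical) base field on the node entity type.
 */
function my_module_update_8001() {
  $definition = BaseFieldDefinition::create('string')
    ->setLabel(t('My field'))
    ->setRevisionable(TRUE);

  // The definition update manager handles the schema change without a
  // full bootstrap-dependent API.
  \Drupal::entityDefinitionUpdateManager()
    ->installFieldStorageDefinition('my_field', 'node', 'my_module', $definition);
}
```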
To update existing entities or field data (in the case of a fieldable entity) following a definition change, we can implement hook_post_update_NAME(). In this hook you can use all the APIs you need to update your entities.
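A minimal sketch of such a hook (the module name and field_subtitle field are hypothetical) that back-fills a value on existing article nodes might look like this:

```php
<?php

/**
 * Back-fills a default value on existing article nodes (hypothetical field).
 */
function my_module_post_update_set_default_subtitle() {
  $storage = \Drupal::entityTypeManager()->getStorage('node');
  $nids = $storage->getQuery()
    ->condition('type', 'article')
    ->accessCheck(FALSE)
    ->execute();

  // A full Drupal bootstrap is available here, so entity CRUD is safe.
  foreach ($storage->loadMultiple($nids) as $node) {
    if ($node->get('field_subtitle')->isEmpty()) {
      $node->set('field_subtitle', 'N/A');
      $node->save();
    }
  }
}
```

For large data sets you would normally batch this work using the hook's $sandbox parameter rather than loading everything in one request.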
To update the schema of a simple or complex configuration (a configuration entity), or a schema defined in a hook_schema() hook, we can implement hook_update_N().
In a custom Drupal project we are often led to create custom content types or bundles of custom entities (something we do not normally do in a contributed module, and rarely in an installation profile). A site building action allows us to create these elements, which are then exported to yml files and deployed to production using the Drupal configuration manager.
A bundle definition is a configuration entity that defines the global schema; we can implement hook_update_N() to update the model in this case, as I mentioned earlier. Bundles are instances that persist as Drupal configuration and follow the same schema. To update bundles, the updated configuration must be exported using the configuration manager so it can be imported into production later. Several problems can arise:
- If we add a field to a bundle and want to create content for this field during the deployment, this action is not trivial using the current workflow (drush updatedb -> drush config-import), since the hook_post_update_NAME() hook is executed before the configuration import.
- The same problem arises if we want to update fields of bundles that have existing data: the hook_post_update_NAME() hook, which is designed to update existing contents or entities, runs before the configuration is imported. What is the solution to this problem? (We will look at one later in this article.)
Importing default content for a site is an action that is not well documented in Drupal. In an installation profile this import is often done in the hook_install() hook, but only when the content data does not have a complex structure with levels of nested references; in some cases we can use the Default Content module. In general, in a module we can’t create content in a hook_install() hook, simply because when installing a module the configuration has not yet been fully imported.
In a recent project I used the drush php-script command to execute import scripts after (drush updatedb -> drush config-import), but this command is not always available during the deployment process. The first idea that comes to mind is to subscribe to the event triggered after the configuration import, so as to create the content that will be available to site editors; but using an event is not a nice developer experience, hence the introduction of a new hook, hook_post_config_import_NAME(), that runs once after the database updates and configuration import. Another hook, hook_pre_config_import_NAME(), has also been introduced to fix performance issues.

A workflow that works for me
To achieve a successful Drupal deployment in continuous integration/delivery cycles using Drush, the most generic workflow that I’ve found at the moment, while waiting for a deployment API in core, is as follows:
- drush updatedb
- hook_update_N() : To update the definition of an entity and its components
- hook_post_update_NAME() : To update entities when you have made an entity definition modification (entity keys, base fields, …)
- hook_pre_config_import_NAME() : CRUD operations (e.g. creating terms that will be taken as default values when importing configuration in the next step)
- drush config-import : Importing the configuration (e.g. new bundle field, creation of a new bundle, image styles, image crops, …)
- hook_post_config_import_NAME(): CRUD operations (e.g. creating contents, updating existing contents, …)
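Assuming the custom hook_pre_config_import_NAME() / hook_post_config_import_NAME() hooks described above are available in your project, steps 4 and 6 might look like this sketch (the module name, vocabulary and bundle are hypothetical):

```php
<?php

use Drupal\taxonomy\Entity\Term;

/**
 * Creates a term used as a default value by the configuration imported next.
 */
function my_module_pre_config_import_add_default_term() {
  Term::create([
    'vid' => 'tags',
    'name' => 'Uncategorized',
  ])->save();
}

/**
 * Creates default content once the new bundle configuration is in place.
 */
function my_module_post_config_import_create_default_content() {
  \Drupal::entityTypeManager()->getStorage('node')->create([
    'type' => 'landing_page',
    'title' => 'Welcome',
  ])->save();
}
```

The key point is the ordering: the pre hook runs before drush config-import (so the term exists when the configuration referencing it is imported), and the post hook runs after it (so the new bundle exists when the content is created).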
This approach works well for us, and I hope it will be useful for you. If you’ve got any suggestions for improvements, please let me know via the comments.
We do a lot of Drupal 8 migrations here at Aten. From older versions of Drupal and WordPress, to custom SQL Server databases, to XML and JSON export files: it feels like we’ve imported content from just about every data source imaginable. Fortunately for us, the migration system in Drupal 8 is extremely powerful. It’s also complicated. Here’s a quick-start guide for getting started with your next migration to Drupal 8.
First, a caveat: we rarely perform simple one-to-one upgrades of existing websites. If that’s all you need, skip this article and check out this handbook on Drupal.org instead: Upgrading from Drupal 6 or 7 to Drupal 8.

It’s Worth the Steep Learning Curve
Depending on what you’re trying to do, using the migrate system might seem more difficult than necessary. You might be considering feeds, or writing something custom. My advice is virtually always the same: learn the migrate system and use it anyway. Whether you’re importing hundreds of thousands of nodes and dozens of content types or just pulling in a collection of blog posts, migrate provides powerful features that will save you a bunch of time in the long run. Often in the short run, for that matter.

Use the Drupal.org Migrate API Handbooks
Here’s a much simplified overview of the high-level steps you’ll use to set up your custom Drupal 8 migration:
- Enable the migrate module (duh).
- Install Migrate Tools to enable Drush migration commands.
- Install Migrate Extras as well. It provides a bunch of, well, extras. Just assume you need it.
- Create a custom module for your migration.
- Use YAML configuration files to map fields from the appropriate source, specifying process plugins for necessary transformations, to the destination. The configuration files should exist in “my_migration_module/config/install/“.
(Pro tip: you’ll probably do a lot of uninstalling and reinstalling your module to update the configuration as you build out your migrations. Use “enforced dependencies” so your YAML configurations are automatically removed from the system when your module is uninstalled, allowing them to be recreated – without conflicts – when you re-enable the module.)
- If you’re running a Drupal-to-Drupal migration, run the “migrate-upgrade” Drush command with the “--configure-only” flag to generate stub YAML configurations. Refer to this handbook for details: Upgrade Using Drush.
- Copy the generated YAML files for each desired migration into your custom module’s config/install directory, renaming them appropriately and editing as necessary. As stated above, add enforced dependencies to your YAML files to make sure they are removed if your module is uninstalled.
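To make the mapping step concrete, here is a minimal, hypothetical migration configuration; the migration id, field names and CSV source are invented for illustration (the csv source plugin comes from the contributed Migrate Source CSV module), so treat this as a sketch rather than a drop-in file:

```yaml
# my_migration_module/config/install/migrate_plus.migration.example_articles.yml
id: example_articles
label: 'Example article migration'
source:
  plugin: csv
  path: /tmp/articles.csv
  ids:
    - legacy_id
process:
  title: legacy_title
  'body/value': legacy_body
  'body/format':
    plugin: default_value
    default_value: basic_html
destination:
  plugin: 'entity:node'
  default_bundle: article
# Enforced dependencies (see the pro tip above) remove this configuration
# automatically when the module is uninstalled.
dependencies:
  enforced:
    module:
      - my_migration_module
```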
Process plugins are responsible for transforming source data into the appropriate format for destination fields. From correctly parsing images from text blobs, to importing content behind HTTP authentication, to merging sources into a single value, to all kinds of other transformations: process plugins are incredibly powerful. Further, you can chain process plugins together, making endless possibilities for manipulating data during migration. Process plugins are one of the most important elements of Drupal 8 migrations.
Here are a few process plugin resources:
- Migrate Process Plugins overview and resources from Drupal.org
- List of Core Migrate Process Plugins quick reference on Drupal.org
- Writing a Process Plugin guide to creating your own process plugin on Drupal.org (Pro tip: do a Google search first; the thing you’re trying to create likely already exists.)
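As a sketch of what a custom process plugin looks like (the plugin id, module namespace and transformation are hypothetical), a class extending ProcessPluginBase only needs to implement transform():

```php
<?php

namespace Drupal\my_migration_module\Plugin\migrate\process;

use Drupal\migrate\MigrateExecutableInterface;
use Drupal\migrate\ProcessPluginBase;
use Drupal\migrate\Row;

/**
 * Trims and lowercases incoming string values.
 *
 * @MigrateProcessPlugin(
 *   id = "normalize_string"
 * )
 */
class NormalizeString extends ProcessPluginBase {

  /**
   * {@inheritdoc}
   */
  public function transform($value, MigrateExecutableInterface $migrate_executable, Row $row, $destination_property) {
    // Each value flowing through the pipeline is normalized here.
    return mb_strtolower(trim((string) $value));
  }

}
```

Once the module is enabled, the plugin can be used in a migration's process section like any core plugin, and chained with others.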
Most of our projects are hosted on Pantheon. Storing credentials for the source production database (for example, a D7 website) in our destination website (D8) code base – in settings.php or any other file – is not secure. Don’t do that. Usually, the preferred alternative is to manually download a copy of the production database and then migrate from that. There are plenty of times, though, when we want to perform continuous, automated migrations from a production source database. Often, complex migrations require weeks or months to complete. Running daily, incremental migrations is really valuable. For those cases, use the Terminus secrets plugin to safely store source database credentials. Here’s a great how-to from Pantheon: Running Drupal 8 Data Migrations on Pantheon Through Drush.

A Few More Things I Wish I’d Known
Here are a few more things I wish I had known about back when I first started helping clients migrate to Drupal 8:

Text with inline images can be migrated without manually copying image directories.
It’s very common to migrate from sources that have inline images. I found a really handy process plugin that helped with this. In my case, I needed to first do a string replace to make image paths absolute. Once that was done, I ran it through the inline_images plugin. This plugin copies the images over during the migration.

```yaml
body/value:
  - plugin: str_replace
    source: article_text
    search: /assets/images/
    replace: 'https://www.example.com/assets/images/'
  - plugin: inline_images
    base: 'public://inline-images'
```

Process plugins can be chained.
Process plugins can be chained together to accomplish some pretty crazy stuff. Sometimes I felt like I was programming in YAML. This example shows how to create taxonomy terms on the fly. static_map allows you to map old values to new ones. In this case, if a value doesn’t match, it gets a null value and is skipped. Finally, the entity_generate plugin creates the new taxonomy term.

```yaml
field_webinar_track:
  - plugin: static_map
    source: webinar_track
    map:
      old_tag_1: 'New Tag One'
      old_tag_2: 'New Tag One'
    default_value: null
  - plugin: skip_on_empty
    method: process
  - plugin: entity_generate
    bundle_key: vid
    bundle: webinar_track
```

Dates can be migrated without losing your mind.
Dates can be challenging. Drupal core has the format_date plugin, which allows specifying the format you are migrating from and to. You can even optionally specify the to and from time zones. In this example, we were migrating to a date range field. Date range is a single field with two values representing the start and end time. As you can see below, we target the individual values by specifying ‘/’-delimited paths.

```yaml
field_date/value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: start_date
field_date/end_value:
  plugin: format_date
  from_timezone: America/Los_Angeles
  from_format: 'Y-m-d H:i:s'
  to_format: 'Y-m-d\TH:i:s'
  source: end_date
```

Files behind HTTP auth can be copied too.
One migration required copying PDF files as the migration ran. The download plugin allows passing in Guzzle options for handling things like basic auth. This allowed the files to be copied from an HTTP-authenticated directory without needing the files on the local file system first.

```yaml
plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password
```

Constants & temporary fields can keep things organized.
Constants are essentially variables you can use elsewhere in your YAML file. In this example, base_path and file_destination needed to be defined. Temporary fields were also used to create the exact paths needed to get the correct remote filename and destination filename. My examples use an underscore to prefix the temporary fields, but that isn’t required.

```yaml
source:
  plugin: your_plugin
  constants:
    base_path: 'https://www.somedomain.com/members/pdf/'
    file_destination: 'private://newsletters/'

_remote_filename:
  plugin: concat
  source:
    - constants/base_path
    - filename
_destination_filename:
  plugin: concat
  source:
    - constants/file_destination
    - filename

plugin: download
source:
  - '@_remote_filename'
  - '@_destination_filename'
file_exists: replace
guzzle_options:
  auth:
    - username
    - password
```
This list of tips and tricks for Drupal Migrate just scratches the surface of what’s possible. Drupalize.me has some good free and paid content on the subject. Also, check out the Migrate API overview on drupal.org.

Further Reading
Like I said earlier, we spend a lot of time on migrations. Here are a few more articles from the Aten blog about various aspects of running Drupal 8 migrations. Happy reading!
Entities and their methods are no longer limited to use within PHP; they are now available in Twig as well.
In this third installment of our series on conversational usability, we look at conversational design, an already well-explored area that is still burgeoning with emerging best practices.
In the Star Trek episode “The Trouble With Tribbles,” there is a vivid example of how small changes lead to monumental consequences over a short period of time.
The episode depicted the effect of a new “species” on an established society, somewhat similar to the rise of open source software and its tools in today's technology landscape.
Yet many of us aren't cognizant of the reach and influence that open source has on our personal and professional lives.
What does the initiative focus on?
To solve this awareness issue, the OpenEuropa initiative was introduced. This European Commission Directorate-General initiative aims at strengthening the adoption of open source tools and practices in consolidating the European institutions' web presence.
To achieve these goals, the OpenEuropa initiative focuses on the following activities.
- Software components licensed under EUPL-1.2
The initiative focuses on building, maintaining, and releasing loosely coupled, reusable software components licensed under EUPL-1.2.
The European Union Public License (EUPL) is a free software license created and approved by the European Commission. Its goal was to provide an open source license available in 23 languages of the European Union that conforms to the copyright laws of the EU's member states.
- Open Source Strategies
The initiative also focuses on building, maintaining, and releasing full-fledged solutions and open source strategies for the European institutions. The specific objectives of these strategies are:
Equal treatment in the procedure
Under this objective, open source solutions and proprietary solutions are assessed on an equal basis, both being evaluated on the total cost of ownership, including associated costs.
Contribution to communities
The Commission's services will actively participate in open source software communities to build strong open source building blocks for use in Commission software.
Clarifying legal aspects
To ease collaboration with open source communities, Commission developers benefit from proper legal coaching and advice on how to deal with intellectual property issues related to open source software.
The strategy puts a strong emphasis on improved governance, on increasing the use of open source in the domain of ICT security, and on the alignment of this strategy.
- Web Services Architecture Overview
The initiative provides a high-level architecture overview of web-related information systems.
A web information system, or web-based information system, is an information system that uses Internet web technologies to deliver information and services to users or to other information systems: a software system whose main purpose is to publish and maintain data using hypertext-based principles.
- Open Source Projects
The initiative contributes back to upstream open source projects. Each project complies with PHP-FIG standards and adheres to the best practices put forward by PHP: The Right Way.
The PHP and Drupal projects are released under the EUPL-1.2 license.
OpenEuropa Coding Standards
OpenEuropa and its components are built with public contribution in mind. For all components and contributions to look and feel familiar, OpenEuropa has agreed to follow a set of coding standards.
The code review component was created to make it easier for contributors to create new components or to modify existing ones. Coding standards are reviewed and validated by OpenEuropa code review across the different OpenEuropa components.

Development Environment
Projects developed under the OpenEuropa initiative do not mandate a specific development environment, but they do rely on the following software packages:

- PHP (required): needed by Drush, Composer, and the Task Runner
- Composer (required): a package manager for PHP
- Git (required): version control system
- Drush (required): command line interface (CLI) integration with Drupal
- Robo (required): required by the OpenEuropa Task Runner
- Node.js (required): required to develop the OpenEuropa theme
PHP: a language most of us have heard of at some point. PHP is required by various tools, including Composer, Drush, Robo, and Drupal itself.
Composer: Composer is used for managing the dependencies of a PHP project. All projects are required to use it, and a plus is its native integration with Drupal.org.
Git: Git is the distributed version control system used as the foundation of the OpenEuropa ecosystem.
Drush: the command line shell and Unix scripting interface for Drupal, used to interact with a Drupal website from the command line.
Robo: a simple PHP task runner, inspired by Gulp and Rake, that aims to automate common tasks.
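Task-runner automation of this kind is typically driven by a small YAML file. The sketch below is a hypothetical runner configuration: the file name (runner.yml) follows the OpenEuropa Task Runner convention, but every key and value shown here is an illustrative assumption, not an actual project default.

```yaml
# runner.yml — hypothetical task runner configuration sketch.
# All keys and values below are illustrative assumptions.
drupal:
  root: "build"
  base_url: "http://localhost:8080/build"
  site:
    name: "My OpenEuropa Site"
    profile: "minimal"
  account:
    name: "admin"
    password: "admin"
```

With a file like this in place, common tasks such as installing a fresh site can be scripted once and run identically by every developer and by CI.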
Node.js: required for developing the OpenEuropa theme. All development dependencies are defined in package.json.

Automated Testing of Functionality
OpenEuropa requires automated tests to be written for every new feature or bug fix to ensure that the functionality keeps working as expected in the future. There are two types of tests.
OpenEuropa practices Behaviour-Driven Development (BDD) to facilitate effective communication between business and development teams. User stories should be accompanied by test scenarios written in non-technical language. Once the user stories are accepted, the scenarios can be run as automated tests to verify that the business requirements keep working.
Pull requests that do not result from user stories can be covered by unit tests rather than BDD scenarios. Developers should use the appropriate unit testing framework available for the programming language in which the component is developed.

Can Drupal components be tested as well?
In addition to the testing frameworks that come with Drupal core, OpenEuropa also uses Behat to describe business requirements. Behat is a test framework for behavior-driven development written in PHP.
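To show how Behat typically hooks into a Drupal project, here is a minimal behat.yml sketch. It assumes the commonly used Drupal Behat Extension and Mink; the base URL, feature paths, and Drupal root shown are illustrative assumptions, not values taken from an actual OpenEuropa component.

```yaml
# behat.yml — minimal hypothetical configuration sketch.
# base_url, paths, and root below are illustrative assumptions.
default:
  suites:
    default:
      paths:
        - '%paths.base%/tests/features'
      contexts:
        - Drupal\DrupalExtension\Context\DrupalContext
        - Drupal\DrupalExtension\Context\MinkContext
  extensions:
    Behat\MinkExtension:
      base_url: 'http://localhost:8080'
      goutte: ~
    Drupal\DrupalExtension:
      api_driver: 'drupal'
      drupal:
        root: 'web'
```

With this configuration, scenarios placed under tests/features can drive a real Drupal site, so the same plain-language steps stakeholders review also double as automated acceptance tests.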
When a pull request changes end-user behavior, the expected behavior must be described in a Behat scenario.
- Each user story is accompanied by a Behat scenario, which is provided to the project stakeholders for acceptance testing.
- The target audience of these scenarios is the stakeholders.
- Every Behat scenario is written in a domain-specific business language and should only be used to describe expected user behavior as specified by the stakeholders.
- Any code that does not directly affect the expected end-user behavior should be covered by unit tests instead.
Drupal 8 introduced the concept of experimental modules: modules that are not yet fully supported or covered by the usual policies, but are included for evaluation purposes. They offer a wide range of functionality, from migration to site building.
Given the experimental nature of these modules, OpenEuropa has defined a set of policies for their use in its components.
Minimum Stability Required
Experimental modules go through the stability levels defined by Drupal: alpha, beta, RC, and stable.
For OpenEuropa to provide stability, only experimental modules in their beta stage or later are allowed.
This is because modules in beta and later stages have a very stable API; whenever the API does change, great care is taken to provide a compatibility layer.
Experimental modules in the alpha state
Although the rules state that alpha modules are not allowed, they can still hold great potential value for customers.
If, for a technical or business reason, an alpha module is justified for a project, it may be allowed as an exception. In such cases, however, the component must carry the same stability label as the experimental module it uses: if you use an alpha module, your component is labeled alpha as well, until the status of the related module changes.

OpenEuropa Release Cycle
OpenEuropa releases its components following semantic versioning. Three types of releases are planned:
- Major: incompatible API changes; very rare and planned in advance.
- Minor: adds functionality and bug fixes in a backward-compatible manner.
- Patch: backward-compatible bug and security fixes that can be deployed immediately; no new functionality is introduced.
Release Preparation and Testing in Drupal
OpenEuropa Drupal components follow the release cycle of Drupal 8 core and will always be tested against:
- The current Drupal 8 core minor release (n)
- The previous Drupal 8 core minor release (n-1)
- The development branch of the next Drupal 8 core minor release
This allows components to follow the same support cycle as Drupal core and to be better prepared for upcoming minor releases as they occur.
For Drupal components, the OpenEuropa team follows a support policy inspired by Drupal core's:
Components support current and previous Drupal Core minor versions. New minor versions for components are made compatible with these respective core versions.
When a new minor core version (n) becomes supported, support for release n-2 is dropped.
Open source and its components have become essential for building trust and safety around software and the web. The initiative contributes to upstream projects, a service-oriented architecture, and the technical governance that drives the design and development of its components.
The OpenEuropa initiative has emerged as a lightning bolt in this dark world of “unawareness.” It has not only harnessed the strengths of open source tools and practices but also established a strong, consolidated web presence.
Drupal 8 has plenty of contributed modules to help you build a headless/decoupled web application. However, getting them all set up correctly can be a daunting task.
Understanding that this is an issue worth addressing, and as mentioned previously in this “Improving Drupal and Gatsby Integration” series, we wrote and contributed two modules: Toast UI Editor and Build Hooks. But there are other modules you will also need, JSON:API, JSON:API Extras, and Site Settings to mention a few, plus a minimum amount of configuration to take care of for a pleasant experience with your Drupal-Gatsby integration.
jmolivas Thu, 01/31/2019 - 09:00
Since I was young, I've been an avid tennis player and fan. I still play to this day, though maybe not as much as I'd like to.
In my teens, Andre Agassi was my favorite player. I even sported some of his famous headbands. I also remember watching him win the Australian Open in 1995.
In 2012, I traveled to Melbourne for a Drupal event, the same week the Australian Open was going on. As a tennis fan, I was lucky enough to watch Belgium's Kim Clijsters play.
In a two-week timeframe, the site successfully welcomed tens of millions of visitors and served hundreds of millions of page views.
I'm very proud of the fact that many of the world's largest sporting events and media organizations (such as NBC Sports, which broadcasts the Super Bowl and the Olympics in the US) trust Acquia and Drupal as their chosen digital platform.
When the world is watching an event, there is no room for error!

Team Tennis Australia, Acquia and Avanade after the men’s singles final.
Many thanks to the round-the-clock efforts from Acquia's team in Asia Pacific, as well as our partners at Avanade!