Although creating, searching for, updating, and publishing content in Drupal is a snap, understanding and making decisions based on that content can be challenging. Questions like, "What are the most viewed, untranslated case studies?" or "Does an accelerated blogging cadence increase page views?" are difficult or impossible to answer within Drupal alone.
Address Field is a handy Drupal module--spun out of Commerce--that provides a field that can be attached to any Drupal entity. It stores addresses in a standard format and integrates reasonably well with the rest of Drupal (for example, with Views or the Entity API). Though most useful for collecting billing and shipping addresses during Commerce checkout, it's also more than adequate for any situation where an address needs to be entered or displayed (e.g., a RedHen contact or a user profile).
One of the more attractive features of Address Field is its ability to dynamically swap, re-order, and re-label its component fields based on a country's norms. For instance, while in the United States postal codes are known as "ZIP codes" and there is a defined list of 50+ states, the United Kingdom refers to them as "postcodes" and calls its administrative divisions "counties." Along the same lines, Brazil tends to write its addresses with the postal code prior to the administrative division. Address Field is aware of many of these differences and handles them gracefully.
Though this feature provides a nice user experience for international site visitors, some peculiarities in Address Field, as well as in the way core Drupal builds and processes forms, prevent this feature from scaling well when dealing with high volumes of unauthenticated traffic.
From the smallest personal blogs to the largest enterprise websites, the ability to understand current, recent, and historical states of the site's health and its users' activities and experiences is paramount to ensuring stability and end-user satisfaction. Drupal core comes packaged with two useful modules that log across two dimensions: Database logging (also known as dblog or watchdog) and Statistics.
For extremely small websites, these modules provide enough information to make basic decisions. However, fundamental limitations prevent more in-depth analysis, discovery, and informed decision-making.
This article will walk through the problems presented by logging in Drupal and highlight a Splunk-based path to enlightenment.
Several weeks ago, I released the Better Statistics module. Its initial release was intended to simply keep track of cache hits and misses, allowing analysis of the effectiveness of this patch (and this related module) on increasing cache hit rate in Drupal's default cache implementation.
Yesterday, I released into the 7.x-1.x-dev branch a major feature: a Statistics API, allowing any module to declare fields of its own and track their values on each page request.
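To give a sense of the shape of the API, here's a minimal sketch of a module declaring a field of its own. The hook name follows the module's naming convention, but the array keys and callback signature shown here are assumptions for illustration; consult the Better Statistics API documentation for the authoritative structure.

```php
<?php

/**
 * Implements hook_better_statistics_fields().
 *
 * Sketch only: declares a hypothetical "cache_status" field to be
 * recorded on every page request alongside the core accesslog data.
 */
function mymodule_better_statistics_fields() {
  return array(
    'cache_status' => array(
      // Schema definition for the column that will store this value.
      'schema' => array(
        'type' => 'varchar',
        'length' => 16,
        'not null' => FALSE,
        'description' => 'Whether this request was a cache HIT or MISS.',
      ),
      // Callback invoked on each request to compute the value to log.
      'callback' => 'mymodule_get_cache_status',
    ),
  );
}

/**
 * Hypothetical callback returning the value to record for this request.
 */
function mymodule_get_cache_status() {
  // In a real module, inspect headers or the request lifecycle here.
  return drupal_page_get_cache(TRUE) ? 'HIT' : 'MISS';
}
```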
Most anyone who's worked with Drupal for even a short period of time knows that the performance, relevancy, and scalability of Drupal's core search is very limited. As a result, and no doubt due in part to its open source DNA, the Drupal community has embraced Apache Solr as core search's de facto replacement for medium-to-large scale installations. A plethora of contributed modules have grown around the main module, and Acquia offers a very nice hosted Solr solution.
For organizations with enterprise needs, an attractive, proprietary search alternative is Google Search Appliance. Whereas Solr operates entirely via an HTTP-based API for indexing and serving results, GSA is a hardware solution hosted within an organization's network that crawls webpages and documents and indexes and ranks them in a manner similar to how google.com does the Internet. Search results can be served directly from the Appliance itself--customized via XSLT transforms--or, like Solr, can be retrieved via GET requests and parsed and served elsewhere.
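To illustrate the GET-based retrieval, here's a hedged Drupal 7 sketch that queries a GSA for XML results and extracts the basics. The hostname, front end, and collection names are placeholders you would replace with your appliance's values; the element names (`R`, `U`, `T`, `S`) come from the GSA's XML results format.

```php
<?php

// Sketch: query a Google Search Appliance over HTTP and parse results.
// 'gsa.example.com', 'default_frontend', and 'default_collection' are
// placeholders for your own appliance, front end, and collection.
$params = array(
  'q' => 'annual report',
  'output' => 'xml_no_dtd',
  'client' => 'default_frontend',
  'site' => 'default_collection',
  'num' => 10,
);
$url = 'http://gsa.example.com/search?' . drupal_http_build_query($params);
$response = drupal_http_request($url);

$results = array();
if ($response->code == 200) {
  $xml = simplexml_load_string($response->data);
  // Each <R> element under <RES> is one ranked result.
  foreach ($xml->RES->R as $result) {
    $results[] = array(
      'url' => (string) $result->U,     // Result URL.
      'title' => (string) $result->T,   // Result title.
      'snippet' => (string) $result->S, // Highlighted snippet.
    );
  }
}
```

From here, the `$results` array can be themed and rendered like any other Drupal listing, which is exactly the "retrieved and served elsewhere" pattern mentioned above.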
Rather than debating the merits of Solr vs. GSA, this article delves into how to integrate Google Search Appliance with Drupal 7, with a focus on maximizing your organization's return on investment by improving the user search experience.
Though "landing page" is often thrown around in casual conversation to mean "any page a user first lands on," the more precise marketing definition usually refers to a page specifically linked to by an advertisement, the goal of which is to get the user to perform an action such as filling out a form, downloading a file, etc.
Hopefully, almost all Drupal devs and site maintainers are familiar with Drupal's page caching mechanism, but I'll provide a brief high-level overview for the uninitiated and those in need of a refresher:
Page cache overview
Drupal dynamically generates all of its pages, weaving through innumerable hooks, alters, and preprocess functions to finally print out a formatted HTML page. The whole process can be slow and resource intensive, so Drupal offers page caching to mitigate the performance issues.
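For reference, anonymous page caching in Drupal 7 is controlled by a handful of variables, normally toggled on the performance settings page but equally settable in settings.php. A minimal sketch (the lifetimes shown are arbitrary examples, not recommendations):

```php
<?php

// settings.php (Drupal 7): enable page caching for anonymous users.
$conf['cache'] = 1;
// Minimum time (seconds) before Drupal regenerates cached pages.
$conf['cache_lifetime'] = 300;
// Max-age sent to reverse proxies / browsers for cached pages.
$conf['page_cache_maximum_age'] = 300;
```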
For all but the simplest of websites, a major version upgrade of Drupal is nothing short of daunting. More than likely, you're currently using modules that don't have stable 7.x releases, or perhaps there's no official upgrade path to the 7.x version. Expect to spend a decent chunk of time "massaging" data through the upgrade process.
We're only halfway through upgrading a very large Drupal 6 site (and I plan a full rundown or series when it's complete), but I have a few nuggets of wisdom that may be valuable to those about to embark on the same journey.
Recently at the ol' day job, we got word through our website bugs alias that there was a duplicate version of the website located at a different, garbage URL. Fairly straightforward: someone was mirroring our website via something like a CNAME record, and we needed to redirect, at the server level, all traffic from that URL to the correct domain name. More interesting, though, was that at the same time, all of the links embedded within our RSS feed were pointing to the same junk URL.
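As a belt-and-suspenders fix on the Drupal side, you can pin the canonical base URL in settings.php so that links Drupal generates (including RSS item links) never inherit a mirrored hostname. The domain below is a placeholder:

```php
<?php

// settings.php: force the canonical base URL so generated links
// (including those in RSS feeds) always use the correct hostname.
// No trailing slash. Replace with your real domain.
$base_url = 'http://www.example.com';
```

This complements, rather than replaces, the server-level redirect: the redirect handles inbound traffic, while `$base_url` keeps outbound link generation honest.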
I'm pleased to announce the beta release of a new module called Flush Page Cache. While it's an extremely useful utility module, I sometimes liken it to the age-old rituals of blowing on a Nintendo cartridge or hitting the side of the TV to get things working again. What it actually does is provide site administrators with a handy button that flushes all of the caches for a single page. So if you have a panel containing a view that lists some nodes and you edit one of those nodes, this lets you quickly and easily flush the cache for the panel page and see fresh content, rather than waiting for the panel's cache to expire naturally. Operators of larger Drupal sites know that flushing the entire cache can feel like waiting on the tarmac at JFK.
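Under the hood, Drupal 7 keys page-cache entries by absolute URL, so expiring just one page's entry looks roughly like the sketch below. This is an illustration of the core mechanism, not the module's actual implementation, which presumably also targets related bins:

```php
<?php

// Sketch: expire the core page-cache entry for the current URL.
// Drupal 7 keys entries in the cache_page bin by absolute URL.
global $base_root;
cache_clear_all($base_root . request_uri(), 'cache_page');

// A thorough per-page flush would also need to clear the other bins
// that feed the page (e.g. cache_block, or Views and Panels caches),
// which is what makes a targeted "flush this page" button non-trivial.
```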