Politics in the Deep Web: A case study for political intelligence
It’s an election year and campaign season is in full swing. Commercials are on every television and radio station, polling cold-calls are being made, and rallies are creating buzz. The world of politics is large, competitive, and fast-paced. The constant flow of real-time updates, combined with information published on non-indexed websites, can make it virtually impossible to stay on top of political research. Even with the best and brightest campaign team, rumors on the newswire or competitive intelligence can easily fly under the radar.
BrightPlanet CASE STUDY: A listening platform for political brand management and competitive intelligence
The campaign of a U.S. Senator and potential V.P. candidate was looking for a “listening platform” to monitor mentions of the senator and other politicians across the Deep Web, media websites, social media, and other targeted websites. Additionally, the campaign wanted to track every change made to opponents’ websites. They required a scalable solution to handle the large volume of daily mentions across a variety of mediums, as well as an open analytic platform that would allow nearly any analytic solution to “plug-and-play” with the data set.
BrightPlanet created a Deep Web Content Silo to harvest, curate, and analyze any mention of the political targets in a searchable, topic-specific repository. This specific Deep Web Content Silo boasts over 150,000 individual documents mentioning the senator after one year of daily harvesting; however, it’s not uncommon for a Silo to hold millions of documents at a time.
For example, the campaign wanted to monitor the U.S. Senate race in Nevada between Rep. Shelley Berkley (D) and Sen. Dean Heller (R). For this particular race, they monitored social media mentions, content from all Nevada newspapers, Nevada blogs, and national political blogs. Beyond typical news monitoring and social media name-drops, the campaign tracked any new YouTube videos, Flickr photos, or Facebook event changes mentioning Rep. Berkley. BrightPlanet customized a daily report for campaign headquarters, based on the Deep Web Content Silo, highlighting important new documents and key metrics.
BrightPlanet also stores copies of each webpage every time that page is harvested, tracking any changes. This feature allowed the campaign to track every change to every webpage within Rep. Berkley’s website. Any retraction or addition within the website would be harvested and archived.
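The mechanics of this change tracking can be sketched in a few lines of Python. This is a minimal, hypothetical illustration — the in-memory `archive` store and `harvest` function are assumptions made for the sketch, not BrightPlanet’s actual implementation, which would persist every snapshot at scale:

```python
import difflib

# Hypothetical in-memory archive: one list of snapshots per URL.
archive: dict[str, list[str]] = {}

def harvest(url: str, page_text: str) -> list[str]:
    """Archive a snapshot of a page and return a unified diff of any
    changes (retractions or additions) since the previous harvest."""
    snapshots = archive.setdefault(url, [])
    if snapshots and snapshots[-1] == page_text:
        return []                      # unchanged since last harvest
    diff = []
    if snapshots:
        diff = list(difflib.unified_diff(
            snapshots[-1].splitlines(),
            page_text.splitlines(),
            lineterm=""))
    snapshots.append(page_text)        # every version is kept
    return diff
```

Because every version is archived, a retraction shows up as deleted lines (`-`) in the diff and an addition as new lines (`+`), so nothing quietly disappears from a monitored site.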
Within the Deep Web Content Silo, results were isolated and analyzed using drill-down filtering, custom facets, entity tagging, topic clustering, and link analysis. Using the OpenPlanet Platform for analytics, the campaign could analyze the 150,000 documents using nearly any analytic tool.
Documents could be filtered by both source type (Deep Web, News, Blog, Facebook, Twitter) and political affiliation (Liberal, Neutral, Conservative). Additional entities tagged in every document included: people, companies, places, topics, PACs, issues, representatives, and senators.
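The drill-down filtering described above can be pictured as simple facet matching over tagged documents. The sketch below is illustrative only — the `Doc` record and `drill_down` helper are assumptions for this example, and the field names are not BrightPlanet’s actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical document record carrying the facets described above.
@dataclass
class Doc:
    title: str
    source_type: str   # e.g. "Deep Web", "News", "Blog", "Facebook", "Twitter"
    affiliation: str   # "Liberal", "Neutral", or "Conservative"
    entities: dict[str, list[str]] = field(default_factory=dict)

def drill_down(docs: list[Doc], **facets: str) -> list[Doc]:
    """Return only the documents matching every requested facet."""
    return [d for d in docs
            if all(getattr(d, name) == value
                   for name, value in facets.items())]
```

A call such as `drill_down(docs, source_type="Blog", affiliation="Neutral")` narrows a large harvest to just the neutral-leaning blog coverage, and each added facet narrows the result set further.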
BrightPlanet’s Deep Web tools help you stay ahead
Unlike solutions that limit you to a predetermined pool of data to search within, BrightPlanet’s scalable technologies let users choose which sources to monitor and at what scale. BrightPlanet’s tools automate your research efforts, freeing up labor hours and reducing expenses. Never again will you miss critical details that could put you ahead at the polls.
- Near real-time intelligence on your candidate and competitors
- Track EVERY change on EVERY page of competitors’ websites
- Monitor any content, from nearly any Web source YOU define, in any language
- Flag individuals and locations for social media monitoring
- Drill-down faceted searching quickly isolates targeted content in Deep Web Content Silos
- Open analytic platform allows customizable output
The Deep Web is home to hidden content that could serve as a turning point for your campaign. Let us help you navigate.