Twitter, search robots get welcomes from Obama White House
- 22 January 2009 12:35
President Barack Obama's stated plan to create a "Google for government" began Tuesday with a WhiteHouse.gov makeover that was announced via a blog entry on the redesigned Web site and a Twitter post.
The Web site transition occurred when Obama officially took over the presidency from predecessor George W. Bush. The Twitter announcement said "Change has come to WhiteHouse.gov" and pointed online readers to the blog post written by Macon Phillips, the Obama administration's director of new media at the White House.
The @TheWhitehouse Twitter account was set up by the Bush administration, which posted more than 1,500 entries there. But the account never attracted much of a following until now. On Monday, it had only about 3,800 followers. By this evening, just a little over 24 hours since Obama's inauguration, the number of followers had topped 14,000 and was seemingly climbing with each page refresh.
Twitter is a familiar communications tool for the new administration. The Obama campaign's Twitter account still has more than 144,000 followers, the largest number for any account on the microblogging site, according to statistics posted on the Twitterholic.com Web site. That's despite the fact that only two new entries have been posted on the BarackObama page since Nov. 5.
The Obama administration may have also used the WhiteHouse.gov redesign to let the tech community know that its plan for a Google-enabled government is under way.
As a senator from Illinois, Obama sought to make federal data searchable and usable in tools such as mashups. His ideas include putting data in standard formats and making the information accessible through RSS feeds and other methods. That is similar to what Vivek Kundra, the District of Columbia's chief technology officer and one of the top candidates for the position of federal CTO under Obama, has accomplished with the DC government's Data Catalog, which makes a variety of municipal data available via the Web.
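To make the idea concrete, the kind of reuse described here amounts to pulling a machine-readable feed of public data and working with the items programmatically. The short Python sketch below shows that pattern; the feed URL is a placeholder for illustration, not an actual endpoint published by the DC Data Catalog.

```python
# Sketch: consuming a hypothetical RSS feed of public datasets, the kind of
# machine-readable access (standard formats, RSS) described in the article.
# The URL below is a placeholder, not a real government endpoint.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.gov/data/catalog.rss"  # hypothetical dataset feed

with urllib.request.urlopen(FEED_URL) as resp:
    tree = ET.parse(resp)

# Standard RSS 2.0 layout: <rss><channel><item> entries, each with a title and link.
for item in tree.getroot().iterfind("./channel/item"):
    title = item.findtext("title", default="(untitled)")
    link = item.findtext("link", default="")
    print(f"{title}: {link}")
```

Any feed that follows the RSS 2.0 structure could be dropped into a mashup this way, which is the point of publishing the data in a standard format in the first place.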
New York-based blogger Jason Kottke noted in a post yesterday that since the transfer of power to Obama, the robots.txt file on the WhiteHouse.gov site had changed significantly. Search engines rely on robot programs to index content, and a site's robots.txt file sets limits on what those robots may index. The Bush administration's robots.txt, whose "disallow" listings ran to some 2,400 lines, has been replaced by a new file that appears to contain just a single entry.
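For readers who want to see the mechanism Kottke is comparing, robots.txt is simply a plain-text list of User-agent and Disallow directives served from a site's root. The Python sketch below fetches a site's robots.txt and counts its Disallow rules; the URL is the one discussed in the article, though whatever it returns today will reflect the current file, not the 2009 versions being compared.

```python
# Sketch: fetch a robots.txt file and count its "Disallow" rules, the kind of
# before/after comparison described in the article. Uses only the standard library.
from urllib.request import urlopen

ROBOTS_URL = "https://www.whitehouse.gov/robots.txt"  # any site's robots.txt works

with urlopen(ROBOTS_URL) as resp:
    lines = resp.read().decode("utf-8", errors="replace").splitlines()

# Keep only the lines that start with a Disallow directive.
disallow_rules = [ln.strip() for ln in lines if ln.strip().lower().startswith("disallow:")]

print(f"{len(disallow_rules)} Disallow rule(s) found:")
for rule in disallow_rules:
    print(" ", rule)
```

Run against the Bush-era file this would have printed roughly 2,400 rules; against the new one, it would print just the single entry the article mentions.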