jeudi 31 octobre 2013

Frazzled Facebook is only partly functional

Summary: The world's most popular social network had major trouble on the US East Coast on Monday morning.
UPDATED 12:20 PM ET: A day without Facebook may not be a day without sunshine, but for its more than one billion users, Facebook not working properly on the US East Coast morning of October 21st was a major annoyance. After several hours of working fitfully, the site is now back up.
Facebookdownchart
Facebook had serious trouble on Monday morning.
This isn't the first time that Facebook has failed, but it was the site's first major disruption in over a year.
The service failure started at approximately 7:45 AM ET. While parts of the social network remained functional for some users, many people reported that they were unable to read, post, or comment on their own or their friends' status pages.
The Web site Down For Everyone Or Just Me reported that the site was indeed down for everyone. The similar site Down Right Now, which gives more detail about Web site problems, reported that Facebook was showing a "Likely Service Disruption." Facebook's troubles did not appear to be part of a larger Internet problem; the Internet Traffic Report was showing normal global Internet traffic.
At 10:55 AM, some users began reporting that Facebook seemed to have started functioning normally. While Facebook did not comment during the disruption, a Facebook representative told ZDNet afterwards: "Earlier this morning, while performing some network maintenance, we experienced an issue that prevented some users from posting to Facebook for a brief period of time. We resolved the issue quickly, and we are now back to 100 percent. We're sorry for any inconvenience we may have caused."
Topics: Networking, Social Enterprise
Steven J. Vaughan-Nichols
Steven J. Vaughan-Nichols, aka sjvn, has been writing about technology and the business of technology since CP/M-80 was the cutting edge PC operating system. SJVN covers networking, Linux, open source, and operating systems.
Kick off your day with ZDNet's daily email newsletter. It's the freshest tech news and opinion, served hot. Get it.

CA crushes it in Q2 earnings, raises outlook

Summary: The U.S. software company has a banner quarter.

ca-technologies-logo-sq-med

CA Technologies on Thursday reported second quarter fiscal 2014 earnings of $0.86 per share on revenue of $1.14 billion, easily topping estimates of $0.73 per share on revenue of $1.10 billion.

The software company was so pleased with its performance that it raised its full-year guidance for revenue (now between $4.47 billion and $4.52 billion) and earnings per share (now between $2.96 and $3.03 in non-GAAP terms).

The company's stock was up 3 percent in after-hours trading.

"To drive growth at CA we are investing in our business," chief executive Mike Gregoire said. "In the second half of the fiscal year we will increase our research and development spend and accelerate our investment in marketing."

ca-2q14-chart01

The company recorded an uptick in second quarter bookings, primarily thanks to higher mainframe renewals. That growth was offset by a decrease in Mainframe and Enterprise Solutions new product sales and Services engagements. In the second quarter, CA executed 10 license agreements worth $232 million.

ca-2q14-chart02

The company was also able to reduce operating expenses through lower personnel costs within selling and marketing and a decrease in commissions.

ca-2q14-chart03

CA ended the quarter with $2.799 billion in cash and equivalents on hand and $1.779 billion in debt.

Andrew Nusca is a writer-editor for ZDNet, contributor to CNET and the editor of SmartPlanet, ZDNet's sister site about innovation. He is based in New York.

MacBook goes All-Pro with 1TB SSD and 16GB RAM

During Apple's fall event on Tuesday (CNET live blog, ZDNET Apple coverage) the company announced a pair of zippy new MacBook Pros with Intel's fourth-generation processors (Haswell and Crystal Well) inside. While the new Intel silicon (combined with Mavericks) is destined to increase battery life, it's not Haswell that's got me excited. It's the new 1TB of PCIe-based flash storage.

Apple was the first OEM to ship PCIe flash drives, in the Mid-2013 MacBook Air this June. The new drives are 45 percent faster than the previous SATA III flash-based models, and nine times faster than a hard drive. Apple claims that the PCIe-based flash drives in the new MacBook Pro are 60 percent faster than those in previous-generation MBPs.

In June AnandTech benchmarked the new MacBook Air SSD with peak sequential read/write performance of nearly 800MB/s, so it's reasonable to expect that the new MacBook Pros will benchmark in the same ballpark - or better. 

But for me, the best part is the brand-new 1TB SSD option:

mbp-late-2013-1tb-pcie-ssd

Sure, it'll set you back another $500 (over the stock 512GB variety), but for me that's huge. It means that, for the first time, I'll be able to store my massive iTunes and Aperture libraries on my internal drive and not have to connect external drives every time I want to work with my music or photos.

In the previous (Early 2013) MacBook Pro, the largest SSD option available from Apple was a 768GB model, so the extra space will be a welcome addition for pros who work with a lot of media files.

Another welcome addition is the optional 16GB of RAM in the 13-inch model:

mbp-late-2013-16gb-ddr3l-sdram

The February 2013 13-inch MacBook Pro maxed out at 8GB of RAM, and 16GB was only available as a Configure To Order (CTO) option on the 15-inch model. Now both the 13- and 15-inch models can be equipped with 16GB of RAM for an additional $200. Since the RAM in Apple notebooks is soldered onto the motherboard and can't be upgraded, you need to order 16GB at purchase time.

With a 1TB PCIe-based SSD and 16GB of RAM, the new 13-inch MacBook Pro is a creative force to be reckoned with and sets the bar even higher for professional-grade notebooks. Once I added AppleCare (another requirement in my book) and sales tax, my 13-inch rMBP set me back $2,940.36. It would have crossed the $3K barrier had I upgraded to the 2.8GHz Core i7, but the large SSD and extra RAM are much better value propositions for me.

After you get over the sticker shock, the other downside is that the 1TB MacBook Pros aren't being stocked at Apple retail stores and must be special ordered (either online or through your business representative), which adds "1-3 business days" to your ship time.

What's your ideal MacBook setup? Did you order one?

MacBook Air with OS X Mavericks: Like getting a new system

MBA OSX
(Image: James Kendrick/ZDNet)

I bought a MacBook Air when it was first announced, largely for the Haswell technology inside. It was a good purchase: the MacBook Air was faster than a MacBook Pro without Haswell, while getting tremendous battery life.

This week Apple released OS X Mavericks as a free upgrade and promised many things, among them longer battery life. An OS upgrade that can extend time away from a power outlet is hard to pass up, so, like many, I upgraded both the MacBook Air and MacBook Pro.

This upgrade has been nothing short of phenomenal on the Air. Apple wasn't kidding when it said battery life would improve with Mavericks, but that isn't the best part. My battery life has indeed increased to 10+ hours, an hour longer than it was prior to the OS upgrade. That alone is fantastic, but it's not the primary benefit.

The MacBook Air was already fast, even faster than my MacBook Pro with its pre-Haswell processor running at double the clock speed. I have never felt the Air was slow or laggy in any respect.

With Mavericks installed, the MacBook Air is faster across the board. Everything I do feels instantaneous. It is such a big difference that it is noticeable the entire time I am using it. A performance boost of this magnitude would normally only come with a hardware refresh.

One area in particular that is faster is the new Safari browser in Mavericks. I have been a long-time Chrome user but have been using Safari since the upgrade, and I have been blown away by how fast the new Safari is in Mavericks. Everything I do in the browser happens instantly. It is now much faster than Chrome, and I never felt Chrome was slow. It is obvious that Safari has been optimized for Mavericks.

It's not just me: friend and colleague Adrian Kingsley-Hughes (@the_pc_doc) confirmed on Twitter that he's seeing the same speed in Safari.

I am very happy with OS X Mavericks. The faster performance coupled with the longer battery life is like getting a new system. That this could be done solely through an OS upgrade is outstanding, and kudos to Apple for making this happen.

I recently upgraded my ThinkPad Tablet 2 to Windows 8.1 and see a marked improvement as a result. As good as that upgrade has been for system improvement, it's nothing close to what I see with OS X Mavericks. 

What I'm seeing on the MacBook Air is not a one-off fluke; I am also seeing consistent performance gains on my MacBook Pro with the Retina Display. That system doesn't have Haswell inside, so Mavericks is boosting older systems, too.

OS X Mavericks is not just a new version, it's like getting a new computer.

See also:

New MacBook Air: Haswell ups the game (review)

OS X Mavericks: What a modern OS upgrade should feel like

Windows 8.1: Makes the ThinkPad Tablet 2 even better

Microsoft WebApps: Mobile Web sites in Windows Phone app's clothing

Summary: Microsoft is packaging up some popular mobile Web sites and making them available as downloadable Windows Phone apps.
There are some interesting new "apps" published by Microsoft showing up in the Windows Phone Store of late.
mswinphonewebapps
These apps, known as Microsoft WebApps, are Web sites packaged up in mobile-app form. They are free and downloadable from the Windows Phone Store. (Thanks to Will Dreiling for the pointer.)
Among the nearly 50 apps in the "WebApps" group are apps for Southwest Airlines, the Food Network, 1-800 Flowers, TMZ, Orbitz, J.Crew, and CarMax.
The WebApps team is part of the Windows Store team, I hear. The WebApps team is different from the Microsoft publisher account, the team that makes official Microsoft apps available for Windows Phone.
I asked Microsoft what the WebApps team is doing and why. A spokesperson sent me the following statement:
"We are helping people access great mobile experiences on Windows Phone by creating pinnable Web Apps that show up in the app list. These are not a replacement for native apps. In most cases we hope that usage of the Web App will encourage the ISV to publish its own native app."
It looks like WebApps are yet another way Microsoft is hoping to encourage developers to build more brand-name, popular Windows Phone apps. I'm not against this tactic. On my Surface RT, I have nearly as many pinned Web sites on my Start screen as I do native apps. Sometimes, I've found a Web app to be as good, if not better, than the native app (example: New York Times).
As of mid-2013, there were approximately 160,000 apps in the Windows Phone store.
Microsoft is believed to be building a unified Windows-Windows Phone Store, but it may not be available until the spring of 2015.
Topics: Mobility, Microsoft, Mobile OS, Software Development, Web development, Windows Phone
Mary Jo Foley
Mary Jo has covered the tech industry for more than 25 years for a variety of publications and Web sites, and is a frequent guest on radio, TV and podcasts, speaking about all things Microsoft-related. She is the author of Microsoft 2.0: How Microsoft plans to stay relevant in the post-Gates era (John Wiley & Sons, 2008).

mercredi 30 octobre 2013

OwnCloud Documents to bring Open Document Format editing to private cloud

Summary: OwnCloud, an open-source Infrastructure as a Service (IaaS) cloud vendor, is readying an Open Document Format editor to go with its open-source, private-cloud software.

Does the world need another cloud-based, online editor when we already have Google Docs, Office 365, and Zoho Docs? OwnCloud says that we do with its forthcoming ownCloud Documents, and it has several compelling arguments.

ownCloudDocs
OwnCloud is bringing ODF group text editing to its open-source, private cloud offering.

First, ownCloud is primarily an Infrastructure as a Service (IaaS) cloud program. With it you can store your files, folders, contacts, photo galleries, calendars and more on a server of your choosing, and then access all this from a mobile device, a desktop, or a web browser. You can also sync your data with local devices and share your data either with the world at large or with specific approved users.

In most ways, it's like public IaaS services such as Google Drive, Dropbox and SkyDrive. However, since it's an easy-to-deploy private cloud service, you, and not some third party, have ultimate control of your documents.

In addition, ownCloud is one of the first cloud services to support Open Document Format (ODF) editing. Others, such as Office 365 and Zoho, can import and export ODF, but ownCloud uses ODF as its native format.

Put all this together and, according to Frank Karlitschek, ownCloud founder, what you get is: "collaborative editing! This feature is implemented in an app called "ownCloud Documents" and will be part of ownCloud 6. People can view and edit their ODF text documents directly in the browser, inside your ownCloud. Another cool thing is that you can invite users from the same ownCloud to work collaboratively on the same document with you. Or you can send invitation links by email to people outside your server to collaborate with you on the document."

OwnCloud and KO GmbH, a software development company specializing in ODF, have done this by integrating KO's WebODF with a "new ownCloud back-end to load, save, share documents and a system to distribute the document changes." WebODF is a JavaScript library for adding ODF support to Websites and mobile or desktop apps by using HTML and CSS to display ODF documents.

This gives its users, according to Karlitschek, the following features:

- It runs purely on your server. There's no communication with centralized services like Google, so your data is always protected against surveillance.
- It doesn't introduce any new server requirements. Just take ownCloud, put it into your Web server's document root, and you have your own collaborative editing server. This is far easier to install and run than, for example, Etherpad.
- All the documents are based on ODF files that live in your ownCloud. This means you can sync your documents to your desktop and open them with LibreOffice, Calligra, OpenOffice, or MS Office 2013 in parallel. Or you can access them via WebDAV if you want.
- You also get all the other ownCloud features like versioning, encryption, and undelete.
- All the code is completely free software. The PHP and JavaScript components are released under the AGPL license. This is different from most other solutions: some claim to be open source but use a Creative Commons license for their code, which is not free software.
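Karlitschek's point about syncing the same files into LibreOffice, or reaching them over WebDAV, rests on ODF being an open, ordinary container format. As a rough illustration (this is not ownCloud or WebODF code, and the sample text is invented), an ODF text document is just a zip archive whose text lives in content.xml, which is what makes it practical for a JavaScript library such as WebODF to read and render it with HTML and CSS:

```python
import zipfile

# Minimal ODF text-document body. The XML namespaces are the standard
# OASIS ones; the document's visible text is the single <text:p> element.
CONTENT = """<?xml version="1.0" encoding="UTF-8"?>
<office:document-content
    xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
    xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0"
    office:version="1.2">
  <office:body>
    <office:text>
      <text:p>Hello from a minimal ODF file.</text:p>
    </office:text>
  </office:body>
</office:document-content>
"""

def write_minimal_odt(path):
    """Write a bare-bones .odt: a zip with a mimetype entry and content.xml."""
    with zipfile.ZipFile(path, "w") as z:
        # Per ODF packaging rules, 'mimetype' should be the first entry,
        # stored uncompressed (ZipInfo defaults to ZIP_STORED).
        z.writestr(zipfile.ZipInfo("mimetype"),
                   "application/vnd.oasis.opendocument.text")
        z.writestr("content.xml", CONTENT)

def read_text(path):
    """Pull the raw content.xml back out of the package."""
    with zipfile.ZipFile(path) as z:
        return z.read("content.xml").decode("utf-8")

write_minimal_odt("demo.odt")
print("Hello" in read_text("demo.odt"))  # True
```

A real document also carries styles.xml, meta.xml, and a META-INF/manifest.xml, but the zip-plus-XML structure is the whole trick.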

That said, Karlitschek admits that ownCloud is still having teething pains. "This is only the first version of this great feature. Not every ODF element is supported but we are working on improving this considerably in the future. We will invest significantly in this because we think that this is a very important feature that is useful for people."

OpenOffice and LibreOffice have both been working on making their office suites available as cloud services. Neither, however, has yet shipped a cloud-capable version of its program. I expect both will within the next two quarters.

So, if you like the idea of having your own cloud-editing service that you control, built on open source, then you should check it out. OwnCloud Documents is part of the ownCloud 6 beta 1, which is available for download now.


Topics: Enterprise Software, Cloud, Linux, Open Source, Software

Steven J. Vaughan-Nichols

Steven J. Vaughan-Nichols, aka sjvn, has been writing about technology and the business of technology since CP/M-80 was the cutting edge PC operating system. SJVN covers networking, Linux, open source, and operating systems.


Apple's iCloud cracked: Lack of two-factor authentication allows remote data download

DSC04465
(Image: Violet Blue/ZDNet)
KUALA LUMPUR, MALAYSIA — Russian security researcher Vladimir Katalov analyzed Apple's secretive iCloud and Find My Phone protocols to discover that neither is protected by two-factor authentication, and iCloud data can be downloaded remotely without a user ever knowing.
In "Cracking and Analyzing Apple’s iCloud Protocols," presented to a crowded room at Hack In The Box security conference last Thursday in Kuala Lumpr, Malaysia, Vladimir Katalov revealed that user information and data is not as inaccessible as Apple is telling the public.
Katalov's findings appear to support his emphatic statement that Apple can access data it claims to not be able to access.
A malicious attacker only needs an Apple ID and password to perform remote iCloud backups — and does not need the user's linked devices.
He explained that there is no way for a user to encrypt their iCloud backups.
The data is encrypted, he explained, but the keys are stored with the data. Katalov added that Apple holds the encryption keys.
Katalov told ZDNet he was shocked to discover that in addition to all of these security chain issues, Apple's iCloud data is stored on Microsoft and Amazon servers.
Katalov's presentation pointed out that because Apple provides full request information to its third-party storage providers (Amazon and Microsoft), Apple could provide this data to law enforcement.
In Apple's July public statement on the NSA PRISM surveillance program, Apple denied any backdoor server access for government agencies. Apple unequivocally stated, "Apple does not give law enforcement access to its servers."
When a user performs an iCloud backup download, they receive an email informing them that the process is complete.
"Apple does not give law enforcement access to its servers." — Apple, July 2013
Katalov discovered that when a remote download is performed, the user receives no notification email. If a user's data is accessed and downloaded from iCloud by a remote third party, they would not know.
Katalov's work represents the first time anyone has analyzed and publicly presented findings on Apple's secretive iCloud protocol.
Vladimir Katalov analyzed Apple's iCloud and Find My Phone protocols by sniffing HTTP traffic on jailbroken devices — though he was careful to explain that a user's devices do not need to be jailbroken for a malicious entity to exploit the remote backup protocol security omissions Katalov discovered.
Analyzing the traffic, he told the crowded room during his Thursday presentation, was not difficult.
Apple's iCloud data consists of what a user stores as a data backup. It contains documents, Dropbox files and sensitive user data.
In his analysis, Katalov discovered that iCloud files are stored as a container (plist) and content, in a files-to-chunks mapping schema.
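To make that container/content split concrete: a property list ("plist") is Apple's standard serialization format, and a manifest of this general shape could map file records to chunk identifiers, with the chunk bytes living in separate blob storage. The keys and values below are purely hypothetical and not Apple's actual schema; the sketch only shows how such a manifest parses with Python's standard plistlib:

```python
import plistlib

# Purely hypothetical manifest, NOT Apple's real schema: a container
# plist mapping file records to chunk IDs, with the chunk bytes assumed
# to live elsewhere (e.g., in third-party blob storage).
container = plistlib.dumps({
    "files": [
        {"path": "Documents/notes.txt", "chunks": [101, 102]},
        {"path": "Photos/IMG_0001.jpg", "chunks": [103]},
    ]
})

# Anyone holding the container can enumerate every chunk to fetch.
manifest = plistlib.loads(container)
chunk_ids = [c for f in manifest["files"] for c in f["chunks"]]
print(chunk_ids)  # [101, 102, 103]
```

The security-relevant point is that whoever can read the container learns the complete map of what to download, which is why holding the keys alongside the data matters.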
But he found that Apple's two-factor authentication, a layer of user security used in addition to a username and password, is not used for iCloud backups (or Find My Phone).
DSC04475
(Image: Violet Blue/ZDNet)
Apple's two-step authentication ("2FA") does not protect iCloud backups, Find My Phone data and the documents stored in the cloud. Katalov details this further in a blog post: "Apple Two-Factor Authentication and the iCloud."
Katalov showed Hack In The Box attendees that with simple queries, it's possible to get the authentication token for accessing the iCloud backup, backup IDs, and the encryption keys. Then one can download the files from where they're stored in Windows Azure or Amazon AWS.
ZDNet caught up with Katalov after his presentation to find out more.
When asked if he had presented his discoveries to Apple, he explained that his findings were the results of protocol analysis — and are not a vulnerability.
Put another way, the iCloud security hole falls into the "it's a feature, not a bug" category.
DSC04471
(Image: Violet Blue/ZDNet)
When ZDNet asked Katalov if there was a way for Apple to fix this issue — such as extending two-factor authentication to its iCloud and Find My Phone services — he shook his head and told us that Apple's implementation of two-factor auth was likely "only an afterthought."
Katalov told ZDNet the best thing a user can to do to protect their iCloud data is to simply not use iCloud.
However, Katalov told us he still uses Apple's iCloud as a backup service. "It is not exactly safe, but I am selecting between security and privacy," he said.
It's easy to argue that because a remote attacker needs an Apple user ID and password, the data is still out of reach to most malicious entities.
However, obtaining Apple user IDs and passwords isn't impossible — aside from email phishing techniques, which are more effective than most would believe. Social engineering techniques are sadly common and also very effective.
A recent example is the spate of Apple ID data thefts in Norway. This past February, a significant number of teenage girls were targeted by boys who easily surmised the girls' user IDs and password recovery information to gain access to their Apple accounts and download the girls' photos and data — which, sadly, ended up passed around and also sold online.
In his Hack In The Box presentation, Katalov told the audience that he was also surprised to discover that when a user shuts off location tracking services, the user's location is still stored for around 3-6 hours.
We wondered if this is what led Katalov to mention that next he will analyze Touch ID protocol and storage — as soon as iOS 7 is jailbroken, he told ZDNet.
"Apple says it never sends the information, and it is never copied to local [storage]" he added, "but I am not so sure."
ZDNet asked why Katalov felt this way, when Apple specifically states that it does not transmit Touch ID information.
Katalov's eyes glittered, and a boyish smile crept across his face. In his thick Russian accent he replied, "Trust no one."
ZDNet has contacted Apple for comment and will update this article if Apple responds.

IHS: Smartphone, tablet factory revenue to surpass entire consumer tech market

Summary: Booming smartphone and tablet sales are doing a lot more than just pushing traditional PCs to the side.

manufacturing-factory

The supply chain for mobile devices alone is poised to become more valuable than factories serving the entire consumer electronics market, based on a new report from IHS iSuppli.

According to the market research firm on Friday, revenue from OEM factories worldwide that produced "media and PC tablets" and "3G/4G cellphones" (which analysts defined as "a category dominated by smartphones") will ring up to $354.3 billion by the end of 2013.

But factories for consumer tech at large are only expected to deliver $344.4 billion in revenue this year. If that happens, it would mark the first time ever that global factory revenue for mobile devices alone has surpassed the rest of the spectrum.

Randy Lawson, a senior principal analyst covering semiconductors at IHS, hinted in the report that these findings shouldn't be all that surprising by remarking "sales growth for CE products has languished in the doldrums."

Just think about how many gadgets have been combined and then replaced by the smartphone: pocket cameras, handheld audio and video recorders, and MP3 players, just to name a few.

He continued:

The fact that these two product categories are on their own able to generate more OEM factory revenue than the entire CE market illustrates the overwhelming popularity of smartphones and tablets. Meanwhile, the CE market has gone flat, with many of the major product types experiencing either low growth or declines in revenue during the past six years.

IHS analysts asserted this trend will continue -- although the pace of growth might slow in 2014, with revenue forecast to increase by only 18 percent to $418.6 billion next year.

Rachel King is a staff writer for ZDNet based in San Francisco.

Only 39 percent of IT projects successful? That's a good start

Another day, another survey announcing disconnect between IT and the business. A new report from Forrester shows high levels of dissatisfaction on both sides, and suggests more "integrated thinking" is needed.

US Census National Processing Center 2 photo from US Department of Commerce
Photo: US Department of Commerce, Census Bureau

The sound bite coming out of a new survey of 474 IT executives conducted by Forrester Consulting on behalf of EffectiveUI is that only 39% believe their internal IT organizations have the ability to regularly deliver projects on time and on budget.

There's actually nothing new in this finding -- in fact, 39% probably is pretty optimistic compared to other studies done over the years, such as Standish Group's Chaos report, which suggests that only 30% of projects meet their goals.

Still, there are many areas where IT organizations don't seem to be cutting the mustard. For example, only 43% of the sample report that their IT organizations collaborate with the business on business.

Here's one that suggests the message of service-oriented architecture or enterprise architecture still hasn't resonated with a majority of enterprises yet: only 31% report that their departments maintain a clearly defined set of business-centric services that the business can easily understand.

Still, end users aren't making things easier. The Forrester/EffectiveUI survey suggests organizational dysfunction is what keeps sending IT projects down in flames. More than half of the IT executives, 56%, say the biggest issue they encounter is users constantly changing requirements on projects midstream. Half of the IT leaders also say their departments are overburdened and end up "trying to do too much at once." More than one-third, 34%, believe they lack clear executive direction, while another 34% point to a lack of the right development talent. A similar number, 32%, cite a lack of stakeholder consensus.

As one respondent put it: “The business folks don't think in terms of what capabilities are nice to have and what are must-haves, and they often give a list of requirements that's too high-level. This doesn't help IT get an accurate sense of how technology can help.”

At the same time, business’ satisfaction with IT is lower than 50%. Despite this low level of satisfaction, only 25% of IT decision-makers place top priority on updating and modernizing key legacy applications, and only 20% believe mobile to be of top importance on their list of priorities.

Enterprise customer-facing apps don't get very high marks from their own creators, either. Only 20% of IT decision-makers surveyed said they were "very satisfied with the user experience of the customer-facing Web applications that are created in-house," and only 14% were very satisfied with their customer-facing mobile applications.

Okay, there's already too much talk about the "lack of IT-business alignment" that seems to be everywhere. What is needed is common-sense, roll-up-your-sleeves collaboration. The two sides inevitably keep getting closer together, because for many organizations, IT has become the business, and the business has become IT.  Plus, members of GenX and GenY keep moving into positions of responsibility, and they were raised on computers.

But, ultimately, technology is only a tool -- it by itself won't put function into a dysfunctional organization.

The Forrester report isn't just a gripe sheet. The report's authors recommend that internal development teams should strive to understand the big picture and take responsibility for not only the creation of the application, but also the impact it has on the business. Here are their three recommendations for increasing collaboration between business and IT:

1. "Take back responsibility for great applications with integrated thinking." Such thinking requires application development teams to understand the big picture and take responsibility not just for building the applications but for their positive (or negative) impact on the business.

2. "Task integrated design with people who understand both business and technology." Great design, the report's authors say, "comes from people who can enumerate the options, try decisions on for size, and make informed choices that fit within design constraints, but that are consistent with the vision for the final product."

3. Don't treat it as just another process. "Measure every design decision based on its impact on user experience."

(Thumbnail photo: James Martin/CNET)

Microsoft makes Surface docking station available in limited quantities

When it launched its second-generation Surface tablets and new peripherals, Microsoft told users not to expect the new Surface Power Cover or docking station until early 2014. 

surfacedock

Something seemingly changed. The docking station is available now. (I'm not sure how long it will remain available, or how much stock there is, but if you really want one, hurry.)

I was alerted to the dock's early availability by reader Aaron Craig, a sys admin for risk-management company Bickmore. He ordered four Surface docking stations on October 22 and had them delivered on October 24. (He sent me pictures to prove it.)

I just checked on Surface.com and Microsoft's online store site and also see the docking stations are available for order, with the option for next-day delivery.

I've asked Microsoft if this is just a temporary situation or if the docks arrived earlier than expected. No word back so far.

Update: Windows SuperSite's Paul Thurrott told me that Microsoft execs said docking stations would be available ahead of 2014 but only in limited quantities, which I hadn't heard. So, again, if you want one sooner rather than later, it's probably best to hurry. 

The Surface docking station, which costs $199.99, allows users with the original Surface Pro or the Surface Pro 2 to dock their tablets (with keyboards attached). The docking station includes a DisplayPort, audio input and output jacks, an Ethernet port, as well as one high-speed USB 3.0 port and three USB 2.0 ports.

Here's one shot Craig sent me of one of his just-acquired docking stations:

aaroncraigdock

The Power Cover for the Surface 2, Surface Pro and Surface Pro 2 is still not available for order yet and is still designated as "coming early 2014" on Microsoft's Surface.com site.

Microsoft officials said the company sold $400 million worth of its first-generation Surface tablets in the most recent fiscal quarter, which ended on September 30, 2013. They also said they sold double the number of Surfaces as they did in the previous calendar quarter, but we have no way of knowing how many that means, as Microsoft hasn't released Surface sales data (and it has sold quite a number of Surfaces at a discount).

Microsoft officials also said during the company's earnings call on October 24 that demand for Surface RT units was stronger than the company expected. They noted that a number of potential Surface Pro purchasers held off on buying devices in anticipation of the Intel Haswell-based Surface Pro 2's arrival.

Microsoft began making its second-generation Surfaces, the ARM-based Surface 2 and the Intel-based Surface Pro 2, commercially available as of October 22. Microsoft also cut $100 off the price of its first-generation Surface Pro devices this week.

Update 2: That was relatively quick. As of 4 pm EST on October 25, the Microsoft Store online is showing the Surface docking station as out of stock. Still no word back from Microsoft as to its ramp-up plans for dock supplies. (Thanks to WPCentral's Daniel Rubino for the out-of-stock pointer.)

BBM for Android and iOS released, sign up for the queue now

Summary: There haven't been a lot of lines for BlackBerry devices in the past, but for some reason there seems to be a lot of excitement for BBM on Android and iOS.
BlackBerry has teased launching BBM for iOS and Android for a while now and today is release day, as reported on BlackBerry's blog. Even though you can download it to your device, you can't use it just yet.
I enjoyed using the BlackBerry Q10 and even the Z10, but still only have about five BBM friends on the BlackBerry platform. That said, I refreshed the Google Play BBM for Android page until I was able to download it on my Note 3.
Installing and then launching BBM for Android just gets you into the queue as soon as you enter your email address. I have been in the queue for about an hour and have no idea when I will get the email giving me access to BBM so I cannot talk about my experiences with the app.
The bigger question for me is, will I finally get more than five friends on BBM? If I do, it's certain that those friends are not running it on BlackBerry hardware.
Related coverage
Matthew Miller started using a Pilot 1000 in 1997 and has been writing news, reviews, and opinion pieces ever since.

Post-PC means it's time for enterprise IT to change

Employees using their own kit. It's a very post-PC thing to do.

A friend of mine asked me today for some advice about a problem he needs to solve.

He has a mobile workforce, several thousand people strong. They all have tablets — let's assume the devices are iPads, although they don't need to be. Each tablet has its own cellular connection.

What my friend wants is to control access to the web while the users are out of the office, in the same way that he does for on-premises desktop machines: no Facebook, no dodgy malware-drenched sites.

It's not unusual to lock down access to the web on desktop machines. We've been locking down the web in corporate environments since the web was first invented.

But as soon as you give those users iPads and send them off into the big, wide world, achieving the same effect is much less obvious. More to the point, the fact that with post-PC devices this is much harder to achieve tells us something about how enterprise IT is changing.

These changes to enterprise IT are happening in an interesting way. Consumerization of IT, enterprise mobility, and an increased awareness of how IT systems work within society as a whole are all pushing this change. Post-PC ideas -- mobility that lets users break out of the physical and temporal boundaries of work, coupled with the growing appeal of cloud-based systems -- arrive just as this change is happening, and are also causing it to happen.

In essence, post-PC is both cause and effect.

The first thing to think about is that in the PC days, "lock everything down" was the obvious thing to do. The IT department worked from the perspective of controlling risk and cost as a primary goal. This was done through a sort of gentle "mistrust" of the user base (which I don't mean in a pejorative sense). The IT department knew best, it was in charge, and it made the decisions.

My Twitter friend Matt Ballantine describes it well: the role of IT is moving from that of a "utility company," providing services top-down to the organization, to one of acting as counsel to it. Acting as counsel means advising rather than providing, in much the same way a lawyer gives advice rather than getting their hands dirty.

Coming back to the original idea about locking down access to the web when the user is mobile, the old utility company way of looking at things suggests control and that locking things down is a good idea. Users should not be trusted. However, in the new "acting as counsel" way, the IT department may look at it differently. Empowered employees working in an environment where risks are respected and managed may be beneficial to the business for all sorts of reasons.

This shift to acting as counsel goes hand in glove with the move to post-PC -- i.e. it's how the IT department needs to behave for everyone to get the most out of it.

My friend who needs to lock down web access for thousands of tablets has found that there are no good solutions to his problem. And this brings us onto the second change that we're experiencing as enterprise IT moves into the post-PC era.

Technically, there seem to be only two solutions available. The first is a reverse proxy: a mobile device management (MDM)-forced policy makes devices route all traffic from the cellular network to an outward-facing proxy server, into the corporate network, and then back out into the world if the proxy policy allows it. (Ugly.) The second is to force a VPN connection -- but VPN works poorly in mobile scenarios because it's designed around reliable fixed connections rather than flaky mobile ones.
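For the curious, here is roughly what the MDM side of the first option looks like on iOS. Supervised devices can be pushed a configuration profile carrying Apple's global HTTP proxy payload, which forces web traffic through a designated proxy. This is a minimal sketch only: the server name, port, and identifier values are placeholders, not a real deployment.

```xml
<!-- Fragment of an iOS configuration profile (one PayloadContent entry).
     Requires a supervised device; proxy.example.com is a placeholder. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.proxy.http.global</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadIdentifier</key>
    <string>com.example.globalproxy</string>
    <key>PayloadUUID</key>
    <string>6A2AFE80-0000-0000-0000-000000000001</string>
    <key>ProxyType</key>
    <string>Manual</string>
    <key>ProxyServer</key>
    <string>proxy.example.com</string>
    <key>ProxyServerPort</key>
    <integer>8080</integer>
</dict>
```

Even with this in place, every request still has to hairpin through the corporate network before reaching the wider internet, which is exactly the ugliness at issue.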

I suspect a lot of you have thrown up a little bit just reading about those solutions. They are not lovely!

(As a side note, if you have any good ways to solve this, please do chime in via the comments!)

In pre-post-PC enterprise-land, difficult, non-obvious things were fun. But in post-PC enterprise-land, we're acting more as counsel, and as part of that, we're expecting to involve non-technologists within the business to find solutions and bring them to us for advice. These will often be cloud-based, software-as-a-service-style solutions.

More to the point, the solutions will generally be very lightweight, cheap, and easy. Implementing them should be a process of checking boxes and long, lazy, self-congratulatory lunches.

To be properly post-PC any solution for locking down web access should be obvious and easy. But it isn't obvious and easy, ergo it isn't something that we're supposed to be doing. 

This may seem like tortuous logic, but let me see if I can make the thought process a little smoother.

The fall of the PC started with the desire for people to have better relationships with the people and things that they love. As it moves into the enterprise, post-PC is supposed to help support their work as well by improving relationships with colleagues, customers, and partners.

That involves listening to colleagues outside the IT department who identify tools that will help them, and listening to end-users who know how they want those tools to work.

This whole thing is very fluid, light, adaptable, and adapting. Complex and expensive, big-ticket, consultancy-led IT projects don't really fit into that model. Neither does top-down control and "mistrust" (or however you want to label it).

My friend, for his specific organizational needs, needs to lock down his devices for reasons I have to keep private here -- but for the rest of us, things are very much changing as enterprise IT moves into the post-PC era.

If it's not light and fluffy -- and if you feel like a technician when you implement it rather than a lawyer -- you're doing it wrong.

What do you think? Post a comment, or talk to me on Twitter: @mbrit.

Rhode Island integrator shores up telehealth solutions practice

As testament to its fast-growing footprint in healthcare solutions, integrator Carousel Industries of Exeter, R.I., has appointed Brian Douglas to drive business development for its healthcare and telemedicine practices. 

Douglas' background includes clinical sales expertise involving surgical and medical devices from companies such as Stryker, Kyphone and Medtronic. He also worked for both Polycom and AMD Global Telemedicine, where he was responsible for designing telemedicine systems in conjunction with technology companies including Polycom, Tandberg and Cisco.

In his new role as business development manager for the public sector at Carousel Industries, Douglas said he will concentrate on evangelizing the role of telemedicine technologies as a way for healthcare providers to rethink their care-delivery processes.

Today, many people associate these sorts of solutions with projects and grants focused on rural geographies and funded by state grants. But telemedicine and teleheath technologies can play a much larger role in helping hospital systems and providers handle patient care far more efficiently, Douglas said. In particular, these technologies can help organizations handle basic concerns more quickly, enabling patients to be referred to specialists as necessary and working to address the waiting periods -- or long travel distances -- that have come to be associated with certain doctors' visits. 

One Carousel Industries client that has already invested in collaborative telemedicine is Northern Georgia Health Services, a healthcare system with approximately 500 physicians and more than 5,000 employees that is based in Gainesville, Ga.

Back in 2012, the hospital teamed up with two other regional health systems along with Emory University both to handle training for nurse practitioners and to bring telemedicine services to critical intensive care units. The solution, funded with an $11 million grant, includes videoconferencing, instant messaging, scheduling and mobile device support, among other features. Technologies that are playing a role in the solution include the Polycom HDX7000 Media Center, Converged Management Application, RMX2000 Real Time Conferencing and the VBP 5300-E10 Video Bridge; plus Microsoft Lync Server 2010, SharePoint and InfoPath.

In the corporate sector, some companies are also experimenting with telehealth solutions at manufacturing and production locations. Douglas cites the examples of oil rig operators and a Caterpillar manufacturing site in Texas. 

Either way, Carousel's background in both infrastructure and compelling applied solutions positions it to participate in the telehealth movement.

“Carousel has a clear advantage in its ability to provide solutions for the public sector, considering its vast expertise in networking, data, security and other critical areas," Douglas said when he was appointed. "These technical disciplines represent the foundation of our offering, and I look forward to working with each of our product managers to create extremely targeted solution bundles for our customers."

mardi 29 octobre 2013

Apple dreams of an iPad cloud for enterprise zombies


All Hallows' Eve is soon to be upon us, dear readers. And there's nothing spookier or more bone-chilling than a middle-aged tech writer trying to force bad horror-film analogies into yet another iPad launch post-game analysis.

I watched the fall Apple launch event, of course, along with everyone else; among other things, it brought us two shiny new tablets, the iPad Air and the long-awaited iPad mini with Retina Display.

I'm not going to go into the purely spec-oriented and technical aspects of the devices, or into an analysis of what this might mean for the competition's offerings in the consumer market. That would be a repetitious waste of time.

This has already been written about ad nauseam by our own resident vampires and all the other ghouls who have already successfully leeched the life out of your grubby mouse-clicking fingers in order to wring drops of your precious pageviews from them.

Frankly, I'm not interested in the consumer market. What the kiddies and the pond scum do with their torture toys has no real bearing on what I write about, and frankly, neither should it matter to any IT decision maker or anyone who has to deal with line of business applications in a large enterprise.

And unless you've been a troll sleeping under an old stone bridge, you're probably aware of the trend to move those line of business applications increasingly towards the cloud -- clouds that will not only host enterprise applications and data but also provide services in the form of APIs that devices will consume.

Earlier this summer I paid some attention to what this service-oriented landscape currently looks like, mostly from the consumer perspective. And the more I look at it, the more I realize that Apple's service-oriented strategy is increasingly mirroring its developer ecosystem: a walled garden in a creepy castle.

Cupertino is going to need really tall plants to keep the zombies from escaping.

Sure, lots of people currently bring iPads to work. They use messaging and calendaring services through the iPad's excellent (licensed) Exchange connectivity and they connect to web applications as well as critical line of business Windows applications through Citrix and now even through Microsoft's native RDS.

And while Apple doesn't provide these tools itself, there are excellent corporate MDM solutions for managing iOS devices from a number of industry players, including Cisco, Citrix, Microsoft and Good Technology.

Today, the iPad is an active participant in the on-premises world, mostly because enterprises and third-party vendors have done the heavy lifting to accommodate it, creating workarounds for a device that is not inherently tailored for business. But just how long will that persist?

We know the future of line of business applications is not strictly on-premises; it lies with clouds and SaaS. More immediately, we're going to see a transition toward hybridized, "mashup"-type scenarios where organizations pick best-of-breed SaaS and web services living at different cloud providers and mix them with data providers on- and off-premises.

So while the iPad lives comfortably within the enterprise as a tolerated squatter today, the future is not so certain. Apple has already shown, through its most recent display of "free" software bravado, that it wants productivity users to use iCloud and iWork, as opposed to Office or other alternatives.

The kicking and blood curdling screams from Apple's user base have already started.

While Apple has shown essentially zero interest in creating a canvas for enterprise users, leaving developers to fill the void, it will eventually become intolerant of other parties stepping onto its limited, squishy turf.

As we know from history, the company is an absolute control-freak when it comes to the end-user experience and will not permit "duplication of functionality" and anything else they can shove into their Developer Agreement in order to protect that creepy walled garden. 

Apple tolerates Amazon's, Google's and Microsoft's apps, which use their own respective cloud services on iOS today. But we know that this could change at any time if Apple feels its position is threatened in any way.

If the tone of Tim Cook's comments during the first moments of his opening speech at Apple's most recent product launch is any indication, the company absolutely does have considerable insecurities about its competitors moving into its space.

Cook, a former IBMer, should know better. Enterprises aren't consumers. They don't like to be told by vendors what they can and cannot do, and they hate having restrictions imposed on them. They want their data to be portable, they hate lock-in, and they may have special requirements that prevent them from using a one-size-fits-all cloud.

As Apple faces more competition from the companies that actually know how to run public clouds that cater directly to the enterprise -- Amazon, Microsoft, Google and IBM, as well as from other providers which will create competitive or specialized cloud services -- the value of Apple's DNA-bottlenecked platform and ecosystem diminishes. 

For Apple's devices and services not to be handicapped within the enterprise, it needs to embrace standards for interoperability and data portability, and to show an ongoing willingness to play nice with other cloud providers -- a subject that I touched upon two years ago but that is becoming much more of a concern today.

I don't foresee Apple playing nice in a cloud and service-oriented world. But hey, enterprises. Take your chances. Trick or Treat!

Forgotten in the high-density datacenter: IBM

Summary: New server company Servergy uses IBM Power Architecture to equip the high-density datacenter

While it’s hard to imagine a company the size of IBM flying under the radar, it is very rare to hear mention of it when discussing the public face of the future of high-density datacenter computing. ARM vs. Atom seems to dominate the discussion, with combatants and their supporters trumpeting each new product announcement and technology advance. And in the background, IBM’s Power Architecture has been quietly ramping up.

Offering microserver-sized units with full-scale processing power, a new Texas company called Servergy yesterday announced its new Cleantech line of servers, built around IBM's Power Architecture and optimized for power and performance -- what it describes as the next generation of hyper-efficient servers. The individual CTS-1000 servers are less than 1/4U in size (Servergy compares the footprint to that of a legal pad of paper) and weigh in at 9 lbs., yet offer features and performance usually found in much larger servers.

CTS-1000 server

The heart of the server is a Power Architecture system-on-a-chip (SoC) with eight cores running at 1.5 GHz, up to 2 MB of L3 cache, two 10G and two 1G Ethernet ports, 32 GB of main system memory, and hardware offload engines for security/encryption, networking, and pattern matching. OS support covers a range of Linux distributions. Because of the server's small size, it can increase server density by up to 400 percent compared with traditional 1U servers.

Servergy claims that, thanks to its ground-up re-engineering of the basic concepts behind servers of this nature, the CTS-1000 draws less than 130 watts at full load and can cut power, cooling, and carbon footprints by up to 80 percent compared with other Linux PowerServer implementations.

Servergy is currently the only company, other than IBM, offering Linux-on-Power servers. IBM is pushing the technology hard, having announced in September a commitment of $1 billion in funding for Linux and open-source technologies on the Power architecture.

Topics: Data Centers, IBM, Linux

David Chernicoff

With more than 20 years of published writings about technology, as well as industry stints as everything from a database developer to CTO, David Chernicoff has earned the term "veteran" in the technology world.

Kick off your day with ZDNet's daily email newsletter. It's the freshest tech news and opinion, served hot. Get it.

2 keyboards for iPad Air: ZAGGkeys Folio and ZAGGkeys Cover

Summary: We told you it wouldn't be long until keyboards arrived for the iPad Air and these two look pretty nice.

ZAGGkeys Folio and Cover (Image: ZAGG)

Apple made the iPad Air much thinner and lighter than the previous generation iPad and that means new keyboards are in order. We predicted they would start showing up soon and we were right. ZAGG, maker of a line of keyboards for the iPad and iPad mini, has announced two new models for the iPad Air.

The ZAGGkeys Folio and Cover are similar in design to two models for the iPad mini that were reviewed by ZDNet. The Folio is a case that completely covers the iPad Air while providing a Bluetooth keyboard. 

See related: Two keyboard cases for iPad mini from ZAGG change the game | 9 best iPad keyboards (hands on): March 2013 | Definitive guide to keyboards for iPad and iPad mini

Like the two keyboards for the iPad mini, both the new Folio and Cover have backlit keys with selectable colors and brightness. They also both have a clever hinge that allows adjusting the iPad Air to a variety of viewing angles to fit the situation.

The ZAGGkeys Cover is the thinner of the two keyboards as it only covers the screen of the iPad Air. The Cover model is my favorite for the iPad mini due to its thin form when attached to the tablet.

The ZAGGkeys Folio is listed as shipping now for $99.99; the Cover, also $99.99, should ship in November.

James Kendrick has been using mobile devices since they weighed 30 pounds, and has been sharing his insights on mobile technology for almost that long.

Netflix tops 40 million members, credits multiple viewing platforms

Summary: Netflix said it will double its original content investment, which will still account for less than 10 percent of the overall content spending.

Netflix's strategy of putting its service on multiple distribution channels -- smart TVs, AppleTV, Roku, Google's Chromecast, tablets and smartphones -- is paying off nicely, as the company has landed more than 40 million subscribers.

The company reported third-quarter earnings of 52 cents a share on revenue of $1.1 billion; Wall Street was looking for earnings of 49 cents a share. Netflix's fourth-quarter outlook, a wide range of 47 cents to 73 cents a share, was also above Wall Street's expectation of 46 cents a share.


Netflix's shareholder letter had two common themes: Original content and distribution matters. On the original content front, Netflix touted its Emmy for House of Cards and noted Orange is the New Black has good buzz and viewership. The original content investment has worked so well that Netflix said it will double its investment, which will still account for less than 10 percent of the overall content spending.

As for distribution, Netflix indicated that smarter television is helping. In the shareholder letter, CEO Reed Hastings and CFO David Wells said:

The growth of smart TVs and Internet TV devices, such as AppleTV, Roku, and Chromecast, are increasing the availability of TV streaming platforms. Tablets and phones also are rapidly growing as Netflix viewing platforms.

Netflix also aims to integrate with cable set-top boxes.

We are open to more of these integrations with cable set-tops around the world, but given the fragmented technology footprints, we think it will be many years before cable set-top boxes match Internet set-top boxes for Netflix streaming volume. As a general rule, we’re happy to support devices from other video providers as long as we get application placement commensurate with our popularity.

Topics: E-Commerce, Cloud, Mobility

Larry Dignan

Larry Dignan is Editor in Chief of ZDNet and SmartPlanet as well as Editorial Director of ZDNet's sister site TechRepublic.


Propelics carves niche in enterprise mobile apps

As the concept of downloading mobile applications from a central "store" continues to become status quo in the consumer world, developer and solution provider Propelics is bringing the same idea into enterprise organizations with large mobile workforces that are looking for a better way to manage -- and personalize -- the software that their employees are using.

The San Jose, Calif.-based company, established about two years ago, was formed by a team of IT strategists with extensive experience in customer relationship management (CRM) and enterprise resource planning (ERP) solutions. They were increasingly pulled into mobile engagements starting in 2008, and that's when they decided enterprises could benefit just as much from application "stores" and repositories as individuals do, said Eric Carlson, one of the company's partners.

"We're looking for use cases that change processes and take advantage of a device on the employee side, which is extremely powerful," he said. "We're looking to change how they work with others."

To be clear: the applications that Propelics is developing aren't merely extensions of existing desktop applications. The company is focused more on rethinking existing manual processes that are probably handled today with paper -- or, maybe, not at all.

Carlson cites an example from one client, a large services firm (he can't share the name) with more than 300 delivery stations. Propelics dispatched some of its team members on "ride alongs" so that they could observe drivers on the job. During that process, it discovered that replacement drivers -- those deployed temporarily on the route -- took an extra four hours to get the job done. This was because they didn't have access to the shortcuts of the regular drivers, in terms of who to contact upon arrival or even different routes to take. 

"We realized there was $6 million to $7 million in overtime based on drivers not having street knowledge," Carlson said.

Based on that finding, Propelics proposed an application filled with tips and information about different routes to help streamline the process for replacement drivers. The cost to develop it was about $75,000, but it helped the company save far more than that amount, he said. 

Propelics lists some very high-profile companies on its customer list, including Payless Shoesource, Wells Fargo, Merck, Family Dollar and Hallmark. Its approach is finding particular resonance with companies that are managing a large number of retail locations, Carlson said. That's because the organization can test different ideas in pilot sites and then make them available to the other locations via an app store, as appropriate.

The company currently employs about 30 mobile specialists but expects to double in size over the next 12 months, Carlson said. 

lundi 28 octobre 2013

U.S. plans 'tech surge' to fix balky health exchanges

Summary: The web-based, federally-run insurance exchanges went online on October 1, but they have been beset with problems.


The U.S. Department of Health and Human Services said this weekend that it will solicit outside assistance to fix some of the technical problems that have affected the rollout of the federally-run insurance exchanges that came online on October 1.

The exchanges are core to the Affordable Care Act—colloquially, "Obamacare." 

"The experience on HealthCare.gov has been frustrating for many Americans," the department wrote in a blog post on its website. "Some have had trouble creating accounts and logging in to the site, while others have received confusing error messages, or had to wait for slow page loads or forms that failed to respond in a timely fashion. The initial consumer experience of HealthCare.gov has not lived up to the expectations of the American people. We are committed to doing better."

The department characterized the initiative to fix the exchanges as a "tech surge," referencing the rapid increase of U.S. military personnel in Afghanistan in 2009.

"Our team is bringing in some of the best and brightest from both inside and outside government to scrub in with the team and help improve HealthCare.gov," it wrote. "We're also putting in place tools and processes to aggressively monitor and identify parts of HealthCare.gov where individuals are encountering errors or having difficulty using the site, so we can prioritize and fix them. We are also defining new test processes to prevent new issues from cropping up as we improve the overall service and deploying fixes to the site during off-peak hours on a regular basis."

Andrew Nusca is a writer-editor for ZDNet, contributor to CNET and the editor of SmartPlanet, ZDNet's sister site about innovation. He is based in New York.

Experian caught up in ID theft investigation

Summary: One of the three major consumer credit bureaus is under investigation by the US Secret Service for selling personal data to an ID theft ring.

Security researcher Brian Krebs has uncovered the involvement of credit bureau Experian in an ID theft operation.

Experian, an information services company best known as one of the three major consumer credit bureaus, became involved through its March 2012 acquisition of Court Ventures.

Through research, Krebs demonstrated that Court Ventures had sold data to Superget.info, a "fraudster-friendly" site which marketed the ability to look up personally-identifiable information on millions of Americans.

Krebs cites an interview with Marc Martin, the CEO of another information services company which had a relationship with Court Ventures.  Martin tells of a US Secret Service investigation of Experian related to ID theft and the data sold to Superget.info.

Individuals at Superget.info had presented themselves to Court Ventures as US-based investigators and gained access to Experian data. In fact, they were based in Vietnam, and the individuals have a history of involvement in ID theft.

Experian has also been in the news recently as the agency which performs credit history checks for the troubled government site healthcare.gov.

Larry Seltzer has long been a recognized expert in technology, with a focus on mobile technology and security in recent years.

Is the IT worker stereotype true? Somewhat

Summary: White, male, middle aged -- sound familiar? While the stereotype is still largely true, the demographics are changing.

Chart detail courtesy WSJ

The Wall Street Journal has a new graphic out profiling the information technology worker. It's an interesting look at how the stereotype matches up with reality.

Sure, the majority of IT workers are still male—about 74 percent of them, according to this graphic. And seven out of every 10 still report themselves as white. 

But there are some shifts.

About one-third of all IT workers have degrees in business, social sciences or other non-technical fields. (In fact, about one-fourth don't have a college degree at all.)

Texas is now the number two U.S. state for IT employment, behind only California.

More than a third are less than 34 years old, which means they began to enter the workforce the same year that the film Office Space came out.

And they are not paid nearly as much as some think, given all the buzz around the narrower, more lucrative field of software development: half of all network and systems administrators pull in less than $72,000 per year, and half of all support specialists make less than $46,000 per year.

No reason to fret, though: Jobs for systems administrators are expected to grow 28 percent through 2020, twice as fast as all other occupations. Which means businesses still find lots of value in the occupation.

Andrew Nusca is a writer-editor for ZDNet, contributor to CNET and the editor of SmartPlanet, ZDNet's sister site about innovation. He is based in New York.

dimanche 27 octobre 2013

Sao Paulo begins technology-driven plan to improve the transport system

The city government of São Paulo will use open data and a crowd of trained developers to try to improve its precarious public transport network.

São Paulo's metro network is small but efficient, with only 74 km of track compared with 337 miles (about 542 km) in New York City. As a result, the roads are filled with buses -- and car ownership in São Paulo rose 32 percent to 7.4 million vehicles during the last decade, which partly explains why the city is often nicknamed "the city of 19 million traffic jams".

To try to address these acute urban mobility issues, the mayor's office is promoting a hackathon this weekend (October 26 and 27) to develop applications focused on public transport in São Paulo.

Participants will work from a list of requests and needs of public transport users that have been identified by the city government; judging criteria include the creativity and technical quality of the applications created over the weekend. There will be cash prizes of 8,000 reais ($3,674), 4,000 reais ($1,837) and 3,000 reais ($1,377) for the three best proposals.

Applications will be built on data that SPTrans, São Paulo's public transport authority, has made available since last week. This is significant because until now only Google had access to that information, which includes bus routes, timetables and real-time bus positions.

Any user, Brazilian or foreign, can register to use the data after accepting a simple set of guidelines, which can be found here.
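Transit authorities typically publish route data as CSV files in the GTFS layout (a `routes.txt` file listing route IDs and names, alongside timetable files). As a rough illustration of what an app built on such a feed might start with, here is a minimal Python sketch; the file layout follows the GTFS convention, but the sample route entries are hypothetical, not taken from the SPTrans feed.

```python
import csv
import io

# Hypothetical sample in the GTFS routes.txt layout (route_id,
# short name, long name). A real feed ships these as CSV files
# inside a zip archive downloaded from the transit authority.
SAMPLE_ROUTES = """\
route_id,route_short_name,route_long_name
8000-10,8000,Term. Lapa - Pca. Ramos de Azevedo
175T-10,175T,Metro Santana - Metro Jabaquara
"""

def load_routes(text):
    """Parse GTFS-style routes.txt text into {route_id: short_name}."""
    reader = csv.DictReader(io.StringIO(text))
    return {row["route_id"]: row["route_short_name"] for row in reader}

routes = load_routes(SAMPLE_ROUTES)
print(routes["8000-10"])  # -> 8000
```

A hackathon entry would layer the timetable and real-time position files on top of this same parsing step.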

My own idea - crowdsourcing bus maps in Sao Paulo

Most stops in São Paulo are simple shelters or bare wooden posts, with no information about itineraries or timetables. With this in mind, I once tried to introduce the idea of crowdsourcing a visual representation of itineraries based on the design of London's bus maps - commonly known as "spider maps".

The idea was presented to the Gilberto Kassab administration exactly two years ago, and I worked on it for almost six months. The plan was to find a company willing to build a free map-construction system with various social features, so that the citizens of São Paulo could use Lego-style building blocks, based on London's spider maps, to create a map of bus routes, add points of interest in the surroundings, build custom routes according to their interests, and so on. The information entered could then be edited by other users, Wikipedia-style.

These maps would then be available online, as well as in physical form at the bus stops. Besides crowdsourced access to information, the reward for citizens would be seeing their names credited on the maps as designers and editors.

In our plan, the company supplying the map-construction system would get some marketing exposure on the printed maps, but the real carrot was the prestige of running a major civic project that could later be showcased.

My husband and I then requested permission from Transport for London, the capital's transport authority, to use the map design; it was granted to São Paulo free of charge, and London Mayor Boris Johnson thought the collaboration was a great idea, so we believed we had enough backing to get the idea off the ground.

A typical bus stop in São Paulo. Photo credit: Mark Hillary (cc)

After presenting the idea, we had several meetings with the city government of São Paulo, the city's transport body SPTrans, and the British Consulate, which represented the Mayor of London in the discussions.

While the transport body's marketing team thought it would be a perfect way to change public perceptions of how public transport is run, SPTrans management kept stressing the complexity of São Paulo's system: more than 10 million bus trips every day, several thousand individual routes, routes that change frequently, and so on.

At the beginning of 2012 we gave up on the project, when it became clear that some of these actors would not allow us to proceed. We had asked for three things:

1. Information about bus routes, route numbers and so on. This was already being provided to Google, with SPTrans staff almost exclusively dedicated to the task, on a weekly basis;

2. Agreement to go ahead and launch a tender to find a company willing to build and run the system - the RFI we submitted (in Portuguese) can be found here;

3. Marketing of the initiative through the public transport network's communications channels, so that interested people could help.

At the time, it seemed impossible to get SPTrans to release the same detailed data it was providing to Google, due to "security issues". We then made one last attempt to save the work done during those six months and suggested opening up all the bus data and promoting a hackathon among students at state technical high schools. All we heard from the marketing department at that time was that the team was not willing to help.

Interestingly, in all these meetings with government officials, the recurring question was: "What's in it for you?" They could not believe that my husband and I would devote time, contacts and experience without any monetary compensation.

The answer was always that, from the moment we had the idea, the intent was to help the city - and not only smartphone owners with internet access, but all bus users, since printed crowdsourced maps could be used anywhere in São Paulo.

So the new administration's efforts to open up public transport data and engage citizens were good news to hear, despite my previous experience. Now anyone will be able to create something valuable for everyone, without endless meetings, internal politics and bureaucracy. And that is real progress.

The Apple move that could nullify one of Android's biggest advantages

Summary: Android has leapfrogged iPhone in the quality of its virtual keyboard experience, thanks to SwiftKey. But, Apple has a move it could make.

Image: Jason Hiner

I've been carrying both an iPhone and Android phone for four years, and in 2013 one of Android's biggest usability advantages over iPhone is SwiftKey. However, there's a relatively quick way that Apple could negate that advantage, and we've got confirmation that it's possible.

In April I wrote that Android's two killer features making it better to use than the iPhone are Google Now and SwiftKey. That has held true throughout 2013.

The Google Now advantage was somewhat blunted when Google released Google Now on iOS this spring. Unfortunately, Google Now is tucked inside the Google Search app on iOS and it's not nearly as powerful without Android's excellent notification system (the other big Android advantage I've written about).

When a lot of people think of SwiftKey they think of SwiftKey Flow. That's the gesture feature that lets you swipe across the letters of the virtual keyboard without lifting your finger from the surface and then SwiftKey magically translates it into the word you were creating based on its algorithm and what it learns from your patterns.

The SwiftKey keyboard includes adaptive intelligence. Image: SwiftKey

However, there's a lot more to SwiftKey than that. In fact, I know plenty of users who love SwiftKey but never use SwiftKey Flow. Some of SwiftKey's other key features include:

Predictive text - As you type, SwiftKey gives you three choices for words that you may be typing, and you can simply touch one of those to complete the word; for example, you enter "ext" and it offers "extremely." Even better, it also predicts the next word you'll type, based on common phrases and your history. For example, if I type "Jason" it automatically offers "Hiner" before I even start typing the next word. It's a great time-saver.

Adaptive autocomplete - The other great thing about the way SwiftKey does autocomplete is that it's adaptive. It learns the words you use and it can even (with your permission) look at your text message history or Gmail inbox to learn more about the words, phrases, and jargon you use frequently.

Automatic spaces - When you select a word from SwiftKey's predictive text or autocomplete, it also automatically adds a space after it. That may sound inconsequential, but not having to add spaces saves time that quickly adds up (it also enables SwiftKey's "Flow Through Space" feature that lets you do one gesture to enter an entire sentence without lifting your finger from the keyboard). SwiftKey also automatically removes spaces between a word and a period, question mark, or exclamation mark to end a sentence. This is so useful that I take it for granted and expect the iPhone to do it when I switch back and forth between the two platforms.

SwiftKey Cloud - If you have multiple Android devices using SwiftKey, you can share your SwiftKey profile between them so that all the intelligence it's learning about you is shared across devices. It's also handy if you get a new device: SwiftKey doesn't have to re-learn your habits, jargon, and patterns all over again.
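To make the next-word idea concrete, here is a toy sketch in the spirit of that feature: learn which words follow which in a user's history, then offer the most frequent followers of the last word typed. This is purely illustrative - SwiftKey's real engine uses far richer language models than a bigram count.

```python
from collections import defaultdict, Counter

class NextWordPredictor:
    """Toy bigram next-word predictor: counts word pairs seen in a
    user's history and suggests the most frequent follower(s)."""

    def __init__(self):
        # For each word, a Counter of the words observed after it.
        self.followers = defaultdict(Counter)

    def learn(self, text):
        """Update follower counts from a piece of typed text."""
        words = text.split()
        for prev, nxt in zip(words, words[1:]):
            self.followers[prev][nxt] += 1

    def predict(self, word, n=3):
        """Return up to n candidate next words, most frequent first."""
        return [w for w, _ in self.followers[word].most_common(n)]

p = NextWordPredictor()
p.learn("Jason Hiner wrote this and Jason Hiner edited that")
print(p.predict("Jason"))  # -> ['Hiner']
```

Feeding it more history (say, a user's message archive, as SwiftKey does with permission) simply sharpens the counts.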

The combination of all those things makes entering text on Android a much more efficient and nuanced experience than iPhone (as long as you pay the four dollars for the SwiftKey app on Android).

However, SwiftKey doesn't just sell the world's most-downloaded Android keyboard app. It also licenses its SDK to phone makers who want to improve their software keyboards. SwiftKey CMO Joe Braidwood said that between 10-20 companies currently license SwiftKey's technology or are in the process of trialing it, including car manufacturers and companies working on wearable technology. Most of the deals are confidential but a few companies have publicly stated that they use SwiftKey's technology, including Samsung, which uses it in the default keyboard in its Galaxy line of devices, and Vizio, which uses it in the tablets it has released as companion devices to its TVs.

Notably, the SwiftKey SDK is not limited to Android. It was widely reported last year that BlackBerry 10 uses SwiftKey as the basis of its on-screen keyboard. While neither BlackBerry nor SwiftKey has confirmed it, the evidence is compelling. Of course, both Android and BlackBerry 10 sit on Unix-like foundations (Linux and QNX, respectively), so you could argue that's not much of a stretch.

But I asked Braidwood whether the SwiftKey SDK could potentially work on dissimilar platforms such as Windows Phone and iOS, and he confirmed that it could, since the SDK is written in C++. In fact, he said it would be particularly straightforward to integrate with iOS, since Objective-C interoperates readily with C++. So I asked him directly whether SwiftKey would be willing to work with Apple, and he said SwiftKey would certainly be open to it.

Apple should make it happen.

There are four big advantages that Android devices currently have over the iPhone (especially for power users): larger screens, notifications, Google Now, and the SwiftKey keyboard. Larger screens will likely have to wait until the iPhone 6 next fall (or an Apple phablet). A better notification system will likely have to wait until iOS 8. A Google Now equivalent will also likely have to wait until iOS 8 when Apple can integrate newly-acquired Cue.

In the short term, that leaves SwiftKey as the best opportunity for Apple to nullify one of Android's most important advantages. If SwiftKey is as straightforward to integrate as Braidwood suggests, it could be something Apple folds into a point release of iOS 7.

Lately, Apple has been more likely to acquire small companies and then integrate their technology into Apple products than it has been to license technology. So, a case could be made that Apple's strongest move could be to acquire SwiftKey, or its closest competitor Swype (which is part of Nuance). Apple also has a ton of its own software engineers and it could simply put them to work on emulating many of the same keyboard features, if it hasn't already.

Whatever direction it chooses, Apple needs to act decisively to improve its virtual keyboard. It has fallen behind Android in this area, which is critical for professional users. It has reached the point that when BlackBerry holdouts who love their hardware keyboards are choosing between Android and iPhone, I typically recommend Android, because the keyboard experience is that much better.

Topics: Mobility, Apple

Jason Hiner

Jason Hiner is the Editor in Chief of TechRepublic. He writes about the products, people, and ideas that are revolutionizing business with technology.

Kick off your day with ZDNet's daily email newsletter. It's the freshest tech news and opinion, served hot. Get it.


Microsoft Office 365 and Google Apps face off in DoD contract

The U.S. Army Program Executive Office for Enterprise Information Systems (PEO EIS) has approved for purchase 50,000 seats each of Microsoft's and Google's respective cloud-office offerings.

Microsoft acknowledged the deal in an October 21 blog post. I asked Google officials about the deal and have yet to hear back.

Update: A Google spokesperson also acknowledged that Google was awarded 50,000 seats as part of the deal.


Neither Microsoft (along with its bidding partner Dell) nor Google is actually getting paid for any of these seats until they start selling them into commands. Today's announcement is the commencement of yet another contest between the two office-service rivals.

Any U.S. Department of Defense (DoD) service or agency can go with Office 365 or Google Apps under this new Blanket Purchase Agreement (BPA) without any additional competition required. Under terms of the deal, from what I've heard from my contacts, Microsoft and Google both can sell into different commands in varying amounts, up to 50,000 seats each. 

I believe Microsoft will be fielding its Office 365 for Government SKU — the one which allows government agencies to deploy Office 365 in a controlled, locked-down environment — as its entry in this competition.

A new blog post by Microsoft's Curt Kolcun, Vice President of U.S. Public Sector, doesn't specify which Office 365 SKU Microsoft is using. Instead, it notes that Microsoft will be fielding "Cloud Services, including e-mail and calendaring, Office Web Apps, unified capabilities like Microsoft Lync, and collaboration tools like SharePoint."

Google is going to be pitching Google Apps for Government, a Google spokesperson confirmed, via the following statement:

"The U.S. Army will provide Google Apps for Government to an initial group of 50,000 Army and Department of Defense personnel. This effort is part of the U.S. Army's program to use commercial cloud services to improve collaboration, information sharing and mobile access for the men and women who serve our country. We look forward to working closely with the Army on this project."

Microsoft recently announced that Lync 2013 has been Joint Interoperability Test Command (JITC) certified, allowing DoD organizations to connect Lync 2013 to the DoD’s information network. 

Microsoft also recently announced that Windows Azure was granted a Provisional Authority to Operate (P-ATO) from the Federal Risk and Authorization Management Program (FedRAMP) Joint Authorization Board (JAB). According to Microsoft, Azure is the first public cloud platform to receive a JAB P-ATO. Microsoft is developing an Azure for Government SKU, as well, codenamed "Fairfax." 

Microsoft: Windows RT 8.1 fix still in the works

Summary: Microsoft officials say a fix to Window RT 8.1 is still in the works two days after the company pulled the operating system update from the Windows Store. Updating issues are affecting Surface RT owners only, Microsoft says.

Two days after pulling its Windows RT 8.1 update from the Windows Store, seemingly due to installation issues, Microsoft is still working to fix glitches in the update to the Windows RT operating system and make it downloadable from the Windows Store again.

Microsoft made both Windows RT 8.1 (for ARM-based systems) and Windows 8.1 (for Intel-based ones) commercially available starting October 17. Users who already had the Core or Pro edition of Windows 8 on their devices could get the update for free by downloading it from the Windows Store.


But from the get-go, a number of users were reporting problems finding the update and getting it to install on their Microsoft- and non-Microsoft hardware. There were reports of some Surface RT users having their machines bricked as a result of applying the update.

Microsoft pulled the update from the Store on October 19 without explaining exactly why. Company officials promised to update users about what was happening.

On October 21, the company quietly made available a Surface RT recovery image, allowing those with borked Surface RT 8.1 updates to more easily reinstall Windows RT. For those affected by the Surface RT 8.1 installation issue, here's Microsoft's guidance as to what you can do to try to recover your machine.

 On October 21, a spokesperson sent me this statement:

"Based on our investigations of a situation customers have encountered updating to Windows RT 8.1, we can confirm that as of now this is a Windows update issue only affecting Surface RT customers. While only less than 1 out of every 1,000 (or less than 0.1 percent) Surface RT customers who have installed Windows RT 8.1 have been impacted, improving their experience and ensuring their systems are fully operable as quickly as possible is our number one priority.

We have made recovery media available for download here along with actionable guidance for affected customers. We continue to work towards making the Windows RT 8.1 update available in the Windows Store again and apologize for any inconvenience. Further updates will be provided as they become available."

So even though the problem with Windows RT 8.1 seems to be particular to the Surface hardware, the 8.1 update is not available for anyone with any kind of ARM hardware at this point.

Mary Jo has covered the tech industry for more than 25 years for a variety of publications and Web sites, and is a frequent guest on radio, TV and podcasts, speaking about all things Microsoft-related. She is the author of Microsoft 2.0: How Microsoft plans to stay relevant in the post-Gates era (John Wiley & Sons, 2008).

A look at a 7,235 Exabyte world

Summary: IDC says byte density will surge and the data storage and business implications will be huge. Here are a few thoughts on the aftermath.

Raw storage capacity, known as byte density, will surge from 2,596 exabytes in 2012 to 7,235 exabytes in 2017, according to research firm IDC.
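Those two endpoints imply a steep compound annual growth rate. As a quick back-of-the-envelope check (the year count and rounding are mine, derived from IDC's 2012 and 2017 figures, not stated by IDC):

```python
# IDC's figures: 2,596 exabytes of raw capacity in 2012, growing to
# 7,235 exabytes in 2017. Implied compound annual growth rate over
# the five intervening years:
start, end, years = 2596.0, 7235.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 22.8% per year
```

In other words, capacity nearly tripling in five years works out to storage growing by roughly a fifth every year.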

Aside from buying a bunch of big data and storage stocks, it's a bit unclear what this data surge is going to yield. The business world will either look like an episode of Hoarders or we'll glean some real insights.

IDC argues that if data is going to become insight, organizations will have to be able to prioritize, store and retrieve information easily. Things like social data will need to be used for new business models.

Exabyte breakdown. Source: IDC

Tape and optical storage will be tossed as data moves to the cloud. IDC sees a data repository in the cloud and information will be viewed as a natural resource.

Exabytes by region. Source: IDC

It's hard to doubt IDC's data directionally. Data is growing exponentially and has to be stored somewhere. Here are a few thoughts on what IDC's storage and byte density predictions mean:

Data will have to be governed. We have generic privacy laws today, but if data is truly a natural resource like oil and gas, it will have to be regulated as such.

Industries will be reordered based on data and analytics. Every company will have broad big data plans. Many of these organizations will fail miserably. In many ways, analytics and data insight systems will become like enterprise resource planning applications a few decades ago. ERP had lots of promise to revamp businesses -- and also implementation disasters. New technologies like Hadoop will be the only way to navigate all of this data. The enterprise tech establishment will be rattled and a new pecking order will emerge.

User experience will matter. The big data companies are fascinating and it's fun to watch queries come back with insight. The issue: the user interface on most big data systems keeps the technology in the hands of a few. Having exabytes of data is one thing. Giving the troops in the field insight is another issue entirely.

Owning the data will be everything. The vendors that capture the most data win. Period.

New business models will emerge. Data analysis and brokering will create entirely new models. Look for Facebook to be a player along with Google.

Larry Dignan is Editor in Chief of ZDNet and SmartPlanet as well as Editorial Director of ZDNet's sister site TechRepublic.

samedi 26 octobre 2013

Box serves up file sharing for small Mexican restaurant

Summary: When the owners of New York's Dos Toros needed a recipe for more secure, remote file sharing, they turned to the cloud.

You've probably heard the term "sneaker net." Well, how about "bicycle net"? That's the primary way New York Mexican restaurant chain Dos Toros used to share files and important information before investing in a Box account last year.

Dos Toros is a four-location taqueria chain with approximately 80 employees. Aleta Maxwell, the director of human resources, finance and administration, said as the company grew, it became more difficult to ensure that individual managers were using the same versions of financial information and other spreadsheets. "You've got to keep everything the same across restaurants," added Leo Kremer, co-founder and owner of Dos Toros.

Aside from using email, which is a challenge for version control, the team used to shuttle updates around on bicycles in order to make sure everyone was on the same page.

The Box service was suggested as an alternative by one of the general managers. "Definitely, the security features were a huge point for us," Maxwell said.

Using Box, Dos Toros has created an archive of its spreadsheets, point-of-sale and accounting information, which managers can access -- depending on the controls set by the owners -- to get the information they need to run their individual locations. "We can see who is accessing and touching the different files, which is also a huge consideration," Maxwell said.

Pricing for Box starts at $45 per month for a team of three.

Topics: SMBs, Cloud, Storage

Heather Clancy

Heather Clancy is an award-winning business journalist specializing in transformative technology and innovation
