According to the latest numbers from Ookla’s Net Index, the United States ranks 31st in the world for internet download speeds, and 42nd for upload speeds. The rankings are based on the average of the past 30 days of speed tests run on Seattle-based Ookla’s Speedtest.net site.
While that still puts the U.S. in the top 20 percent of countries, there’s a lot of room for improvement. As Internet-connected devices continue to drive economic growth, increasing broadband speeds to keep up with the rest of the world is key.
The expansion of fiber networks, including Google Fiber and Seattle’s effort to bring fiber connectivity to parts of the city, promises to improve the U.S.’s standing.
But overall, the U.S. is in a tough spot because of its size compared to some of the other countries on the list. Bringing effective Internet infrastructure to a country that spans almost 3.8 million square miles is a much different challenge than doing so in fourth-place South Korea, which covers 38,691 square miles.
The full Net Index report includes a graphic covering internet connectivity for all 186 countries.
NASA’s Lunar Laser Communication Demonstration (LLCD) has made history using a pulsed laser beam to transmit data over the 239,000 miles between the moon and Earth at a record-breaking download rate of 622 megabits per second (Mbps).
LLCD is NASA’s first system for two-way communication using a laser instead of radio waves. It also has demonstrated an error-free data upload rate of 20 Mbps transmitted from the primary ground station in New Mexico to the spacecraft currently orbiting the moon.
“LLCD is the first step on our roadmap toward building the next generation of space communication capability,” said Badri Younes, NASA’s deputy associate administrator for space communications and navigation (SCaN) in Washington. “We are encouraged by the results of the demonstration to this point, and we are confident we are on the right path to introduce this new capability into operational service soon.”
Since NASA first ventured into space, it has relied on radio frequency (RF) communication. However, RF is reaching its limit as demand for more data capacity continues to increase. The development and deployment of laser communications will enable NASA to extend communication capabilities such as increased image resolution and 3-D video transmission from deep space.
“The goal of LLCD is to validate and build confidence in this technology so that future missions will consider using it,” said Don Cornwell, LLCD manager at NASA’s Goddard Space Flight Center in Greenbelt, Md. “This unique ability developed by the Massachusetts Institute of Technology’s Lincoln Laboratory has incredible application possibilities.”
LLCD is a short-duration experiment and the precursor to NASA’s long-duration demonstration, the Laser Communications Relay Demonstration (LCRD). LCRD is a part of the agency’s Technology Demonstration Missions Program, which is working to develop crosscutting technology capable of operating in the rigors of space. LCRD is scheduled to launch in 2017.
LLCD is hosted aboard NASA’s Lunar Atmosphere and Dust Environment Explorer (LADEE), launched in September from NASA’s Wallops Flight Facility on Wallops Island, Va. LADEE is a 100-day robotic mission operated by the agency’s Ames Research Center at Moffett Field, Calif. LADEE’s mission is to provide data that will help NASA determine whether dust caused the mysterious glow astronauts observed on the lunar horizon during several Apollo missions. It also will explore the moon’s atmosphere. Ames designed, developed, built, integrated and tested LADEE, and manages overall operations of the spacecraft. NASA’s Science Mission Directorate in Washington funds the LADEE mission.
The LLCD system, flight terminal and primary ground terminal at NASA’s White Sands Test Facility in Las Cruces, N.M., were developed by the Lincoln Laboratory at MIT. The Table Mountain Optical Communications Technology Laboratory operated by NASA’s Jet Propulsion Laboratory in Pasadena, Calif., is participating in the demonstration. A third ground station operated by the European Space Agency on Tenerife in the Canary Islands also will be participating in the demonstration.
Extending cable-based telecommunication networks requires heavy investment in both urban and rural areas. Broadband data transmission via radio relay links might help to cross rivers, motorways or nature protection areas at strategic node points, and to make network extension economically feasible. In the current issue of the journal Nature Photonics, researchers present a method for wireless data transmission at a world-record rate of 100 gigabits per second.
In their record experiment, 100 gigabits of data per second were transmitted at a frequency of 237.5 GHz over a distance of 20 m in the laboratory. In previous field experiments under the “Millilink” project, funded by the BMBF (the German Federal Ministry of Education and Research), rates of 40 gigabits per second and transmission distances of more than 1 km were reached. For their latest world record, the scientists applied a photonic method to generate the radio signals at the transmitter; after radio transmission, fully integrated electronic circuits were used in the receiver.
“Our project focused on the integration of a broadband radio relay link into fiber-optical systems,” Professor Ingmar Kallfass says. He coordinated the “Millilink” project under a shared professorship funded by the Fraunhofer Institute for Applied Solid State Physics (IAF) and the Karlsruhe Institute of Technology (KIT). Since early 2013, he has been conducting research at the University of Stuttgart. “For rural areas in particular, this technology represents an inexpensive and flexible alternative to optical fiber networks, whose extension can often not be justified from an economic point of view.” Kallfass also sees applications for private homes: “At a data rate of 100 gigabits per second, it would be possible to transmit the contents of a Blu-ray disc or of five DVDs between two devices by radio in just two seconds.”
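Kallfass’s two-second figure checks out with simple arithmetic. The sketch below assumes a single-layer Blu-ray disc of about 25 GB and single-layer DVDs of about 4.7 GB each (typical capacities, not figures from the article):

```python
def transfer_seconds(gigabytes: float, rate_gbps: float) -> float:
    """Time to move `gigabytes` of data over a link running at `rate_gbps` gigabits/s."""
    return gigabytes * 8 / rate_gbps  # 8 bits per byte

blu_ray = transfer_seconds(25, 100)          # one ~25 GB Blu-ray disc
five_dvds = transfer_seconds(5 * 4.7, 100)   # five ~4.7 GB DVDs

print(f"Blu-ray: {blu_ray:.1f} s, five DVDs: {five_dvds:.2f} s")
```

At 100 gigabits per second the link moves 12.5 gigabytes every second, so a 25 GB disc takes exactly two seconds.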
In the experiments, the latest photonic and electronic technologies were combined. First, the radio signals are generated by an optical method: several bits are combined into so-called data symbols and transmitted at the same time. After transmission, the radio signals are received by active integrated electronic circuits.
The transmitter generates the radio signals by means of an ultra-broadband so-called photon mixer made by the Japanese company NTT-NEL. For this, two optical laser signals of different frequencies are superimposed on a photodiode. An electrical signal results, the frequency of which equals the frequency difference of both optical signals, here, 237.5 GHz. The millimeter-wave electrical signal is then radiated via an antenna.
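The photomixing principle can be illustrated numerically. The wavelengths below are hypothetical telecom-band values (not the ones actually used in the experiment), chosen so that their difference frequency lands near 237.5 GHz:

```python
# Photomixing sketch: two optical carriers superimposed on a photodiode
# produce an electrical signal at their difference (beat) frequency.
# Wavelengths are illustrative telecom-band values, not the actual ones.

C = 299_792_458  # speed of light in vacuum, m/s

def beat_frequency_hz(wavelength1_m: float, wavelength2_m: float) -> float:
    """Difference frequency of two optical carriers, in Hz."""
    return abs(C / wavelength1_m - C / wavelength2_m)

# Two lasers about 1.9 nm apart in the 1550 nm band beat at roughly 237 GHz.
print(beat_frequency_hz(1550.0e-9, 1548.1e-9) / 1e9)  # ~237 (GHz)
```

Because the photodiode only sees the optical beat note, tuning one laser shifts the radio carrier directly, which is part of what makes the approach flexible.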
“It is a major advantage of the photonic method that data streams from fiber-optical systems can directly be converted into high-frequency radio signals,” Professor Jürg Leuthold says. He proposed the photonic extension that was realized in this project. The former head of the KIT Institute of Photonics and Quantum Electronics (IPQ) is now affiliated with ETH Zurich. “This advantage makes the integration of radio relay links of high bit rates into optical fiber networks easier and more flexible.” In contrast to a purely electronic transmitter, no intermediate electronic circuit is needed. “Due to the large bandwidth and the good linearity of the photon mixer, the method is excellently suited for transmission of advanced modulation formats with multiple amplitude and phase states. This will be a necessity in future fiber-optical systems,” Leuthold adds.
Reception of radio signals is based on electronic circuits. In the experiment, a semiconductor chip was employed that was produced by the Fraunhofer Institute of Applied Solid State Physics (IAF) within the framework of the “Millilink” project. The semiconductor technology is based on high-electron-mobility transistors (HEMT) enabling the fabrication of active, broadband receivers for the frequency range between 200 and 280 GHz. The integrated circuits have a chip size of a few square millimeters only. The receiver chip can also cope with advanced modulation formats. As a result, the radio link can be integrated into modern optical fiber networks in a bit-transparent way.
Already in May this year, the team succeeded in transmitting a data rate of 40 gigabits per second over a long distance in the laboratory using a purely electronic system. In addition, data were transmitted successfully over a distance of one kilometer from one high-rise to another in the Karlsruhe city center. “The long transmission distances in ‘Millilink’ were reached with conventional antennas that may be replaced by fully integrated miniaturized antenna designs in future compact systems for indoor use,” says Professor Thomas Zwick, head of the KIT Institut für Hochfrequenztechnik und Elektronik (Institute of High-Frequency Technology and Electronics). The present data rate can still be increased. “By employing optical and electrical multiplexing techniques, i.e., by simultaneously transmitting multiple data streams, and by using multiple transmitting and receiving antennas, the data rate could be multiplied,” says Swen König from the KIT Institute of Photonics and Quantum Electronics (IPQ), who conceived and conducted the recent world-record experiment. “Hence, radio systems having a data rate of 1 terabit per second appear to be feasible.”
Source: Science Daily
Mobile subscribers in North America and the Asia-Pacific region will be the drivers of a 13-fold growth in global mobile data traffic from 2012 to 2017, according to a new report from Cisco Systems.
Cisco’s latest Visual Networking Index Global Mobile Data Traffic Forecast report indicates that North American mobile subscribers will continue to consume more data per subscriber per month than those anywhere else in the world. However, subscribers in the Asia-Pacific region will account for 47.1 percent of all mobile data traffic by 2017, up from 35 percent in 2012, making the Asia-Pacific region the largest in terms of data consumption.
Cisco’s annual report is widely cited every year by carriers and vendors alike as a key benchmark for measuring and predicting data traffic, and also as a data point to justify calls for network investment, traffic management technologies and more spectrum.
The report, Cisco’s sixth, estimates that over the next five years mobile data traffic will reach 11.2 exabytes (an exabyte is a billion gigabytes) per month. According to the forecast, in 2017 the average mobile subscriber worldwide will use 2 GB of data per month, up from around 200 MB in 2012, and will consume around 10 hours of video per month, up from one hour in 2012.
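The 13-fold figure over the five years from 2012 to 2017 implies a compound annual growth rate of roughly 67 percent, which a couple of lines of Python confirm:

```python
def cagr(multiple: float, years: int) -> float:
    """Compound annual growth rate implied by an overall growth multiple."""
    return multiple ** (1 / years) - 1

# Cisco's 13-fold growth over the five years 2012-2017:
print(f"{cagr(13, 5):.0%}")  # → 67%
```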
Cisco’s projected growth rate is slower than the one it forecast last year, when it said mobile data traffic would grow 18-fold between 2011 and 2016 to 10.8 exabytes per month. Now Cisco thinks traffic will hit 7.4 exabytes per month in 2016.
Arielle Sumits, principal analyst for Cisco’s VNI Forecast, told FierceWireless that Cisco decided to take a more conservative approach in forecasting the growth rate of data from laptops connected to cellular networks. “It’s normal to see a little bit of the tapering in the growth rate over time,” she said.
Sumits said that Cisco “started to see this year, more than other years, a regional divergence” in mobile data traffic growth. North America, for instance, will see a dramatic increase in average mobile data usage per subscriber per month, the report predicts, going from an average of 752 MB per month in 2012 to 6 GB per month in 2017. Subscribers in Asia-Pacific will jump from using an average of 136 MB per month in 2012 to around 1.75 GB per month in 2017.
Subscribers in other regions will see similar jumps: in Western Europe subscribers will go from using 491 MB per month on average to 3.26 GB; in Latin America the growth will be from an average of 122 MB per month to 1.3 GB; in Central and Eastern Europe from 200 MB per month on average to 2.27 GB; and in the Middle East and Africa from just 73 MB per month on average to 990 MB per month.
However, the sheer volume of growth in Asia-Pacific will dwarf other regions, according to the VNI forecast. The forecast predicts that the number of mobile subscribers in the region will grow by 600 million from 2012 to 2017, up from 2.2 billion to 2.8 billion. The number of mobile devices and connections in the region will skyrocket from 3.47 billion to 5.24 billion. In North America, the growth will be much slower: The number of mobile users will climb from 288 million in 2012 to 316 million in 2017, and the number of devices and connections will jump from 459 million in 2012 to 841 million.
What will be driving this traffic growth? In North America, smartphones’ share of total data traffic will inch up slightly from 49 percent in 2012 to 52 percent in 2017, according to Cisco. Traffic from laptops will drop dramatically from 40 percent of all traffic in 2012 to just 13 percent in 2017. Tablet traffic will climb from 6.8 percent in 2012 to 28.3 percent in 2017 as shared data plans encourage more tablet adoption. Traffic from machine-to-machine applications will grow from 2.6 percent in 2012 to 6.6 percent in 2017.
In Asia-Pacific, the picture is different. There, smartphones are expected to make up the lion’s share of total data traffic over time as adoption increases and smartphone prices come down. Cisco forecasts that smartphones will make up 78 percent of all data traffic in the region in 2017, up from 46 percent in 2012. As in North America, traffic from laptops will drop, from 42 percent in 2012 to 11 percent in 2017. Tablets and M2M traffic will only make up 5.1 percent and 4.1 percent, respectively, of data traffic in the region in 2017, Cisco estimates.
One other notable aspect of the report is its take on the growth of LTE through 2017. Cisco predicts the number of 4G connections worldwide will steadily rise from 60.4 million in 2012 to 135.2 million in 2013 and up to around 992 million in 2017. That forecast is more conservative than a recent forecast from IHS iSuppli on LTE subscriber growth.
Cisco found that in 2012 only 1 percent of global connections were 4G but that 1 percent drove 14 percent of all global mobile data traffic. By 2017, 4G connections will represent 10 percent of global connections but will generate 45 percent of the data traffic. “That’s a huge jump,” said Thomas Barnett, director of service provider marketing at Cisco.
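Those shares imply just how much heavier a 4G connection is than a non-4G one. A small sketch, using only the percentages quoted above, makes the per-connection comparison explicit:

```python
def relative_intensity(conn_share: float, traffic_share: float) -> float:
    """How many times more traffic a connection in a group generates than a
    connection outside it, given the group's share of all connections and
    of all traffic (both as fractions)."""
    per_conn_in = traffic_share / conn_share
    per_conn_out = (1 - traffic_share) / (1 - conn_share)
    return per_conn_in / per_conn_out

print(round(relative_intensity(0.01, 0.14), 1))  # 2012: ~16x a non-4G connection
print(round(relative_intensity(0.10, 0.45), 1))  # 2017: ~7x a non-4G connection
```

The multiplier shrinks by 2017 not because 4G users consume less, but because 4G stops being a small niche of heavy users and becomes a tenth of all connections.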
An Australian medical centre is reported to be considering paying a ransom demand of AU$4,000 (£2,600) after blackmailers broke into the organisation’s servers and encrypted its entire patient database.
According to ABC News, Miami Family Medical Centre on the country’s Gold Coast had called in a third-party contractor to try and restore the data from backups but it remained unclear whether this would prove sufficient to return the database to its previous state.
“We’re trying to work out how to pay the hackers or find someone to decrypt the information,” said centre co-owner David Wood.
The centre was continuing to receive patients but Wood admitted this was proving “very, very, very difficult” without patient records.
“What medication you’re on can be retrieved from the pharmacists [and] pathology results can be gotten back from pathology,” he told ABC News.
According to Wood, the attackers had accessed the database directly rather than using a remote Trojan.
“We’ve got all the antivirus stuff in place – there’s no sign of a virus. They literally got in, hijacked the server and then ran their encryption software,” he said.
“It’s people who know how to break in past firewalls and hack passwords to get onto the server.” No data had been compromised, Wood claimed.
The attack is not the first to affect medical centres in the country. Barely three months ago, dozens of businesses were reportedly hit by ransom malware and hijacking, including at least one other small medical business.
Perhaps not coincidentally, earlier this month US backup firm NovaStor reported a suspiciously similar attack on an unnamed US medical practice around Halloween that encrypted critical data, including X-rays.
The business was able to beat the blackmailers thanks to NovaStor’s backup system, which is probably the only reason the world got to hear about this near-disaster.
That is the obvious Achilles’ heel of the ransom industry: cloud or offline backup. Any business or individual that mirrors data to a separate system that can’t itself be hacked should be able to defend against ransom attacks.
The wider phenomenon of data ransoming is overwhelmingly that of Trojans infecting individual PCs in order to encrypt consumers’ private data, but the latest Australian attack could be an example of a separate trend to target and attack specific types of business.
The criminals appear to favour targeting smaller businesses likely to be heavy with valuable data but lack the resources to back it up as comprehensively as might a larger organisation.
The culprits behind the Miami Family Medical Centre attack are believed to be Russian, which fits with a 2012 Trend Micro report suggesting that the core of the ransom industry could be a single gang.
A Symantec report analysed the boom in such attacks during the last year, suggesting that in the consumer space as many as three percent of victims probably paid up. Even that rate, the company said, makes the tactic hugely profitable.
German researchers analyzed a sample of 13,000 Android applications and found that more than 1,000 contained serious flaws in their SSL implementations.
The researchers, from Leibniz University in Hannover and Philipps University of Marburg, published a paper (PDF) detailing their findings. They found that 17 percent of the SSL-using apps in their sample suffered from implementations that potentially made them vulnerable to man-in-the-middle (MITM) attacks.
The researchers claim they were “able to capture credentials from American Express, Diners Club, PayPal, bank accounts, Facebook, Twitter, Google, Yahoo, Microsoft Live ID, Box, WordPress, remote control servers, arbitrary e-mail accounts, and IBM Sametime”.
In addition, since anti-virus software also uses SSL: “We were able to inject virus signatures into an anti-virus app to detect arbitrary apps as a virus or disable virus detection completely.”
This issue has come about because of developers misusing the SSL settings the Android API offers.
Examples given by the researchers include apps that are instructed to trust all certificates presented to them (21 of the 100 apps selected for a MITM test behaved this way). A further 20 of the MITM-tested apps were configured to accept a certificate regardless of its associated hostname; for example, an app connecting to PayPal would accept a certificate issued for an entirely different domain. Other issues included SSL stripping and “lazy” SSL implementations by developers.
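The paper’s examples are Android (Java) apps, but the same anti-pattern exists in any TLS stack. Purely as an illustration, here is what “trust all certificates, ignore the hostname” looks like with Python’s standard ssl module, next to the safe default:

```python
import ssl

# Correct: the default context verifies the certificate chain AND the hostname.
safe_ctx = ssl.create_default_context()
assert safe_ctx.verify_mode == ssl.CERT_REQUIRED
assert safe_ctx.check_hostname is True

# The anti-pattern the paper describes, translated to Python: accept any
# certificate for any hostname, which is what enables MITM attacks.
unsafe_ctx = ssl.create_default_context()
unsafe_ctx.check_hostname = False       # ~ a lax HostnameVerifier on Android
unsafe_ctx.verify_mode = ssl.CERT_NONE  # ~ a trust-all TrustManager on Android
```

A connection opened with `unsafe_ctx` will happily complete against any server presenting any certificate, which is exactly the exposure the researchers measured.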
The researchers also noted that a number of apps provided insufficient feedback to users, for example, failing to tell the user whether or not it was using SSL to transmit user credentials.
Hewlett-Packard is working with T-Mobile USA to offer 200 MB of free HSPA+ data per month for two years to anyone who buys an HP notebook, specifically the 11-inch Pavilion dm1, which HP is currently selling online for $400.
HP said that starting Oct. 26, anyone who buys the 11-inch Pavilion dm1 will get 200 MB of free data per month for two years, and customers do not need to sign a contract with T-Mobile for the data service. Customers will also get a free 25 GB account with Box, a cloud storage company.
This is similar to what Verizon Wireless did in 2010 by offering a free 100 MB per month for two years to those purchasing a notebook running Google’s Chrome OS.
HP’s offer is also similar in some respects to the one Amazon is making for its new Kindle Fire HD with LTE. For $50 per year, Amazon is offering users 250 MB of data per month from AT&T Mobility.
These are some of the tests I have been running to see how my T-Mobile data speeds have been, mostly while my phone is running on EDGE (Enhanced Data rates for GSM Evolution), i.e. 2G. No one at T-Mobile seems to have any idea why my speeds are so slow, because even on EDGE or 2G I should average just above 100 kbps. There are a few reasons my speeds are at times relatively fast: at home, for example, I have a security system that runs off of the T-Mobile network, and it boosts my signal and speeds while I am in the house and not connected to Wi-Fi. As you can see, the results cover different locations, and only a few of them show a decent average data speed; the rest are almost laughable, definitely not the speeds I want to see for the money I pay.
So far I have talked to five T-Mobile techs on the phone, and I also went into a T-Mobile store, where they told me to buy a new phone. It’s not my phone; I am not an idiot, and they can’t tell me something like that just to make me spend more money. I am using an unlocked Torch 9800, so I am on EDGE basically all the time unless I am connected to a Wi-Fi network, which is OK with me. The thing is, I popped my SIM card into my wife’s Bold 9780 on T-Mobile with 3G coverage and I still had these horrible speeds. So if anyone from T-Mobile reads this and thinks they can help, please let me know, or if anyone else out there has a suggestion, please let me know. I have tried about everything, and to me it looks like they have throttled my data speeds for some reason, even though I have an unlimited data plan, and this has continued through almost two billing cycles. Look at my speeds below and compare, and let me know if you feel my pain with T-Mobile, or with any other carrier that might just be brushing you off like I feel they have done to me.
Gist Speed Test (Global Internet Speed Test)
- 1) 10:00- 31 kbps
2) 11:30- 43 kbps
3) 2:00- 5 kbps (didn’t even finish test)
- 1) 6:05- 71 kbps
2) 10:05- 141 kbps
3) 10:20- 179 kbps
- 1) 9:04- 14 kbps
2) 12:51- 32 kbps (Done after a battery pull cause phone got locked up data speed was too slow)
3) 2:17- 23 kbps (Walking from outside to inside office)
4) 4:30- 25 kbps (Done after a battery pull cause phone got locked up, data speed too slow)
- 5) 5:42- 24 kbps
- 6) 7:11- 209 kbps
7) 7:31- 233 kbps (While on the phone with T-Mobile Inside)
8) 7:45- 69 kbps (While on the phone with T-Mobile Outside)
- 1) 8:15- 27 kbps
2) 10:47- 29 kbps
3) 12:14- 20 kbps (Done after a battery pull cause phone got locked up, data speed too slow)
4) 1:33- 19 kbps
985 South Friendship Rd.
- 1) 7:36- 81 kbps
- 2) 7:55- 108 kbps
- 3) 8:15- 36 kbps
4) 9:13- 94 kbps
- 5) 8:45- 20 kbps
McEver (Waffle House)
- 1) 10:22- 190 kbps
Spout Springs (Target)
- 2) 10:57- 138 kbps
85 South (Exit 99)
- 3) 12:01- 41 kbps
Johnson Ferry Rd.
- 4) 12:55- 69 kbps
- 1) 11:40- 31 kbps
12:22- Battery Pull
2) 12:55- 41 kbps
3:45- Battery Pull
- 3) 8:05- 39 kbps
Hwy 20 (Dunkin’ Donuts)
- 1) 7:26- 63/54 kbps
- 2) 2:23- 127 kbps
3) 3:18- 36 kbps
- 1) 9:02- 20 kbps
2) 2:06- 47 kbps
3) 2:58- 37 kbps (Twitter for Blackberry working for first time)
4) 4:52- 18 kbps (Everything is functional for First time since 5/23/2011)
- 1) 11:21- 16 kbps
Buford Dam Rd. (Shadburn Ferry Intersection)
- 1) 7:17- 133 kbps
- 2) 9:33- 41 kbps
- 3) 5:22- 29 kbps
Hwy 20 (Exiting 985)
- 4) 7:28- 86 kbps
- 1) 8:20- 55 kbps
Aesthetic Dermatology (Sanders Rd. Cumming, Ga.)- 2) 5:38- 215 kbps
3) 5:40- 226 kbps
Spout Springs Rd. (Ross)
- 1) 11:24- 16 kbps
Buford (Bona Allen Mansion)
- 2) 5:27- 218 kbps
Mundy Mill Rd. (Arby’s)
- 1) 8:11- 247 kbps
- 1) 11:45- 28 kbps
2) 2:32- 21 kbps
3) 4:25- 27 kbps
- 1) 12:38- 35 kbps
2) 4:28- 21 kbps
- 1) 8:24- 41 kbps
- 1) 8:26- 32 kbps
2) 12:15- 56 kbps
3) 1:51- 95 kbps (this test had to be a fluke)
4) 1:53- 31 kbps
Atlanta Hwy. (Just Past Hall Middle)
- 1) 12:07- 46 kbps
Mundy Mill (Walmart)
- 2) 12:53- 190 kbps
Venture Rd. Buford, Ga. Mall of Ga.
(Babies R Us)
- 1) 1:02- 78 kbps
- 1) 8:25- 29 kbps
Spout Springs (Chick-fil-A)
- 1) 8:25- 125 kbps
- 1) 8:24- 21 kbps
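If you want to summarize a log like the one above, a few lines of Python will do it. The readings here are just the first six values from my list, as an example:

```python
# Summarize a set of speed-test readings (values in kbps).
readings_kbps = [31, 43, 5, 71, 141, 179]  # e.g., the first six results above

avg = sum(readings_kbps) / len(readings_kbps)
print(f"average: {avg:.0f} kbps, worst: {min(readings_kbps)} kbps, "
      f"best: {max(readings_kbps)} kbps")
```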
I have still not had any of my issues resolved with T-Mobile as of today. I spoke with another BlackBerry specialist who works for T-Mobile, making this my sixth time calling them. I spoke with him on Friday, July 2; he took all my information and wrote up a slip to give to an engineer to have them check out my specific network connection, which took about 45 minutes to complete. He stated that it could take up to 72 hours to determine the exact problem, that I should not call back, and that he would call me either Friday evening or Tuesday at some point. Well, Friday passed, Tuesday passed, and now Wednesday has passed, and I have yet to hear anything from anyone at T-Mobile. I will probably have to call them for the seventh time now, and this time I might have to tell them to drop my service line. What’s the point in paying money for data that’s slower than my mother’s DSL line, the one she’s had since DSL became available? At least that’s how it seems; I can’t even run a simple Twitter application on my BlackBerry, that’s how slow my data speeds are. Hopefully I will hear something today, and I will post any results I get from T-Mobile. Maybe they will fix my problem, or really, their problem.
Well, today is Tuesday, July 12th, and I have not heard back from T-Mobile on any kind of resolution to my data problems. I called T-Mobile on Friday, July 8th, and spoke with another representative in the technical support department. He stated that he did not know why the rep I spoke with the previous Friday would tell me it would take 72 hours, and that they really had no idea how long it would take the engineers to discover the problem. He checked the status of my complaint and the engineers’ slip we had sent out last Friday. It turned out, he said, that the engineers had to send my case to the “higher-ups” (the way it was described to me), and that it would take a little longer to find the root of the problem.
Doesn’t this just seem like too much of a hassle? It has been almost two months now that I have had these problems with my T-Mobile data connection. You would think they could just reset my network connection completely and reconnect me as if I were a new customer who had just started using their network. Anyway, they are supposed to give me a call back today with the status of my complaint. I hope they actually call today; we will see.