Tuesday 14 July 2015

New Android Malware Sprouting Like Weeds


Android
If you own an Android device and want to minimize the risk of malware infection, avoid discount app stores. "It's recommended not to download apps from unknown app stores, unless you personally trust them," said Andy Hayter, security evangelist at G Data. He added that users should install a malware scanner and check an app's requested permissions (in the device settings) before installing it.

According to the latest report from G Data Security Labs, the information stored on Android devices such as smartphones and tablets is threatened by more than 4,950 new malware files. Cybercriminals have taken a growing interest in the Android operating system over the past few years, and according to Hayter, Android devices are a bigger, easier and more profitable target than other platforms. G Data Security Labs predicts that more than 2 million new Android malware samples will surface in 2015.

Is this just the beginning? 

Android is a derivative of Linux, an operating system generally considered a lesser target for malware and viruses. When it comes to Android devices, however, the reality is quite different: Android is less secure and less rigorously policed than other mobile platforms, according to Rob Enderle, principal analyst at the Enderle Group.

The latest reports, and G Data Security Labs' 2 million figure, are realistic because a growing number of users now rely on Android devices for online shopping and banking transactions. Android holds a larger market share than iOS and Windows Phone, which makes it correspondingly more interesting to cybercriminals, security researchers and malware authors alike. Last year Google introduced premium SMS checks, and malware models have spread much faster since.

Android malware and cybercriminals: 

Browse the Google Play Store and you will find both paid and free apps; as ordinary users, we tend to prefer the free ones. Developers of free Android apps typically depend on advertising to fund further development, but bad apps have the ability to hide themselves in the background. According to G Data Security Labs, malware is a new financial foundation for cybercriminals: more than 50 per cent of current Android malware includes SMS Trojans, online shopping Trojans, banking Trojans and other such components.

Forty-one per cent of consumers in Europe, and 50 per cent in the US, use a smartphone or tablet for banking transactions, while 78 per cent of internet users make purchases online through a smartphone or tablet. Malware can install further apps, steal your personal information, or harvest your credit card and other financial data for later use.

Monday 13 July 2015

Drones Could Help Snuff Out Future Wildfires


Drone
We live in an era of technology, and every technology has its pros and cons. In recent weeks, both sides have been on display in the use of drones to monitor wildfires.

In the last week of June 2015, emergency workers were fuming after hobbyist-flown drones disrupted firefighting efforts in southern California. According to reports, the drones were hovering at 11,000 feet, far above the legal 400-foot altitude limit, forcing a DC-10 air tanker loaded with 11,000 gallons of flame retardant to turn away from its drop point. Two other planes heading for the same target were also forced to abort their missions because of the drone interference.

According to the Los Angeles Times, the disruption allowed the fire to spread, at a cost of between $10,000 and $15,000. Even so, researchers and tech experts hope that the same machines now frustrating firefighters could one day play a vital role in wildfire prevention. A team of researchers from the University of California, Berkeley has developed a system that uses drones, satellites and airplanes to detect wildfires in a given region.

In search of flames: 

The system is known as FUEGO (Fire Urgency Estimation from Geosynchronous Orbit). It adapts technology originally used to study supernovas in space, pointing it back towards the Earth instead. The process starts with satellites fitted with powerful infrared cameras that monitor patches of land in California or any other region.

The cameras photograph the land at wavelengths of light that are emitted by fires but invisible to the naked eye, and beam the images back to the ground, where land managers can track the particular geographical region. In addition, airplanes and drones equipped with infrared cameras patrol fire-prone regions to paint a higher-resolution picture of fire risk.

When the drones spot a fire, they can swing into action and provide real-time feedback on its nature and extent. The technology is expected to be especially helpful at night, when aerial tankers are grounded, and in assisting mop-up operations by confirming from the sky that every ember has been snuffed out.

Is it just a concept? 

No. According to the researchers, once the technology is fully functional the system will be able to detect and analyze even a small fire within two to five minutes. "We will launch the entire system in the next few years, as we currently don't have a satellite for it," said Carl Pennypacker, an astrophysicist at Lawrence Berkeley National Laboratory and UC Berkeley's Space Sciences Laboratory, and the project's lead coordinator.
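For a flavor of what automated detection of a small fire might involve, here is a hypothetical sketch that flags anomalously bright pixels in a simulated infrared frame. The frame, the temperatures and the 5-sigma threshold are all invented for illustration; this is not FUEGO's actual pipeline.

```python
# Hypothetical hot-spot detection on an infrared frame: flag pixels that
# sit far above the background level. All numbers here are invented.
import numpy as np

def detect_hot_spots(frame, sigma_threshold=5.0):
    """Return (row, col) coordinates of pixels far above the background."""
    background = np.median(frame)    # robust estimate of the background level
    spread = np.std(frame)
    return np.argwhere(frame > background + sigma_threshold * spread)

# Simulated 100x100 frame: ~300 K background noise plus one small hot patch
rng = np.random.default_rng(1)
frame = rng.normal(300.0, 2.0, size=(100, 100))
frame[40:43, 60:63] += 150.0         # a nascent fire, nine pixels in size
print(detect_hot_spots(frame))       # coordinates of the hot patch
```

A real system would compare successive frames over time rather than a single snapshot, but the core idea of looking for statistical outliers against the infrared background is the same.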

In the meantime, the team is testing drones that can send back information about hot zones and provide high-quality images to help shape fire-prevention strategy.

New Horizons Hiccup won't Affect Pluto Mission Science


Pluto_Mission
Space scientists are planning to return New Horizons to normal science operations ahead of its historic Pluto flyby, now that they have figured out the cause of its weekend glitch, NASA says.

"I'm pleased that our mission team quickly identified the problem and assured the health and proper operation of the spacecraft," said Jim Green, NASA's director of planetary science. With Pluto in its sights, he added, the mission is on the verge of returning to normal operations and going for the gold.

Last Saturday, the New Horizons team traced the failure to a hard-to-detect timing flaw in the spacecraft's command sequence, which occurred during preparations for the July 14 flyby. The flaw knocked the spacecraft out of communication for more than 90 minutes.

Contact was restored, in protective safe mode, after control was switched from the spacecraft's primary computer to its backup. "The spacecraft is fully operational again, and we are able to download data and receive commands," said Glen Nagle, spokesman for the Canberra Deep Space Communication Complex at Tidbinbilla.

The piano-sized spacecraft let scientists and engineers know that it was healthy enough to receive and transmit messages and commands, and the New Horizons team worked through routine troubleshooting to track down the glitch. Returning the spacecraft to normal mode will take a few days, partly because signals traveling at the speed of light take four and a half hours to reach the probe, and another four and a half hours for its response to come back.

The operation that triggered the flaw will not be repeated before the encounter. New Horizons is currently 9.9 million kilometers (6 million miles) from Pluto and closing at about 50,000 kilometers per hour (30,000 mph).
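The quoted times are easy to sanity-check with some quick arithmetic. In the sketch below, the Earth–Pluto distance (roughly 32 astronomical units in July 2015) is an assumed figure, not one given in the article; the remaining distance and flyby speed are the article's numbers.

```python
# Light travel time to New Horizons, and remaining time to the flyby.
AU_KM = 149_597_870.7        # one astronomical unit in km
C_KM_S = 299_792.458         # speed of light in km/s

distance_km = 32 * AU_KM     # assumed Earth-Pluto distance, ~4.8 billion km
one_way_hours = distance_km / C_KM_S / 3600
print(f"one-way light time: {one_way_hours:.1f} h")   # ~4.4 h each way

remaining_km = 9.9e6         # distance to Pluto quoted in the article
speed_km_h = 50_000          # flyby speed quoted in the article
print(f"time to flyby: {remaining_km / speed_km_h / 24:.2f} days")  # 8.25 days
```

The one-way figure lands right around the four and a half hours NASA quoted, and at 50,000 km/h the remaining 9.9 million km is a little over eight days of travel, consistent with the July 14 flyby date.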

NASA says the outage will not affect the $728 million mission's ability to meet its primary objectives. "In terms of output, it will not change an A into an A-plus," said Alan Stern, the mission's principal investigator, of the Southwest Research Institute.

New Horizons was launched nine years ago with the objective of studying the environment of Pluto and its moons. Its instruments will map the surface of the dwarf planet, gather data about its composition, and taste the dust in the surrounding nitrogen-rich atmosphere.

The craft will also take high-resolution close-up pictures of the dwarf planet to help researchers understand its surface composition. After the flyby, New Horizons is expected to spend 16 months or more sending its data back; meanwhile, the team is already drawing up plans for its next flyby.

Saturday 11 July 2015

US Researchers Find A Way to Make Internet Access Cheaper, Faster


fiber_optic
Recently, US researchers found a way to increase the speed at which data travels over a fiber optic network, and the technique is expected to be adopted commercially to provide cheaper, faster internet. According to Nikola Alic, a data scientist at the Qualcomm Institute, part of the University of California at San Diego, the technique will take a couple of years to make a meaningful impact; it all depends on the implementation process and the determination of the technical community. He likened the underlying distortion problem to quicksand, where the more you struggle, the faster you sink.

The researchers found a significant way to improve the performance of fiber networks, with potential benefits for both ISPs (internet service providers) and consumers. Their technique manages the distortion that builds up in the network as power is added, allowing data to travel longer distances before being reconditioned by an electronic regenerator.

The findings could eliminate the need for electronic regenerators. Fiber optic cables can carry information over long distances without power supplied along the way, but the signal degrades with the distance traveled, and the degradation gets worse when the power is boosted to increase the speed at which the data travels.
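As context for why regenerators matter at all, here is a back-of-the-envelope sketch: optical power falls off exponentially with distance. The 0.2 dB/km loss used below is a common textbook figure for single-mode fiber, not a number from the article.

```python
# Optical power remaining after a fiber run, given a loss rate in dB/km.
def received_power_mw(launch_mw, km, loss_db_per_km=0.2):
    """Power (mW) left after `km` kilometers of fiber."""
    return launch_mw * 10 ** (-loss_db_per_km * km / 10)

# A 1 mW signal drops to 0.01 mW after 100 km -- roughly why long links
# need amplification or regeneration at regular intervals.
print(received_power_mw(1.0, 100))   # 0.01
```

Extending the distance a signal can travel before needing reconditioning, as the researchers did, directly reduces how many of these power-hungry regeneration stages a long link requires.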

The main concern with repeaters is that they must be applied across anywhere from 80 to 200 data channels, which is both power-hungry and expensive. To reduce the number of repeaters in the network, the researchers used wideband frequency combs to make sure that the signal distortion, known as crosstalk, stays predictable.

That predictability means the crosstalk a signal accumulates in the fiber can be undone, restoring it to its original state when it arrives at its final destination. Crosstalk is not random, Alic explained; it is governed by strict physical laws, even though in the lab it had appeared random, which is why it remained a mystery.

According to Doug Brake, a telecom policy analyst at The Information Technology & Innovation Foundation, crosstalk affects how information is sent over fiber optic networks, but it is not the limiting factor when building last-mile infrastructure. Ultimately, the researchers discovered that the frequency variations across the entire communication channel need to be tuned at the source.

To produce a meaningful result in the lab, the researchers were able to retrieve information after it had traveled 12,000 kilometers (7,400 miles) through fiber optic cable using only standard amplifiers. According to experts, the technology is particularly attractive for new ISP players such as Google, which are adopting aggressive marketing strategies and technologies to capture market share.

Algorithm Accounts For Uncertainty To Enable More Accurate Modeling


Algorithm
Data Integration Algorithm – Improving Modelling Accuracy

Parametric uncertainty is a notable source of error when modelling physical systems: the values of the model parameters characterizing the system are unclear due to inadequate knowledge or limited data. In such cases, a data integration algorithm can improve modelling accuracy by quantifying and reducing this uncertainty. Nevertheless, these algorithms frequently require a huge number of repetitive model evaluations, incurring significant computational resource costs.

To address this issue, PNNL's Dr Weixuan Li, together with Professor Guang Lin of Purdue University, proposed an adaptive importance sampling algorithm that could alleviate the burden imposed by computationally demanding models. In three test cases, they demonstrated that the algorithm could effectively capture the complex posterior parametric uncertainties of the problems examined while also improving computational efficiency. With the great headway made by modern computers, numerical models are now routinely used to simulate the behaviour of physical systems in scientific fields ranging from climate to chemistry and from materials to biology, several of them within DOE's core mission areas.

Several Potential Applications

However, parametric uncertainty often arises in these models due to insufficient knowledge of the system being simulated, resulting in models that diverge from reality. The algorithm created in this study offers an effective means of inferring model parameters from any direct and/or indirect measurement data through uncertainty quantification, thereby improving model accuracy. The algorithm has several potential applications; for instance, it can be used to estimate the unknown location of an underground contaminant source, and to improve the accuracy of a model that predicts how the groundwater is affected by that source. Two of the key techniques implemented in this algorithm are:

  •  A Gaussian mixture (GM) model, adaptively built to capture the distribution of the uncertain parameters 
  •  A mixture of polynomial chaos (PC) expansions, built as a surrogate model to relieve the computational burden of forward model evaluations 

These techniques give the algorithm great flexibility in handling multimodal distributions and strongly nonlinear models while keeping computational costs at the lowest level.

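As a toy illustration of the first idea, the sketch below runs adaptive importance sampling with a two-component Gaussian-mixture proposal against an invented bimodal target density. It is only a sketch under assumed settings: the target, the component count, the fixed component width and the crude refitting step are all illustrative, and the polynomial chaos surrogate is omitted entirely. It is not the authors' implementation.

```python
# Adaptive importance sampling with a Gaussian-mixture (GM) proposal.
# The bimodal target stands in for a multimodal posterior over one
# uncertain parameter; all constants here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def target_pdf(x):
    """Unnormalized bimodal 'posterior' with modes near -2 and +2."""
    return np.exp(-0.5 * (x + 2.0) ** 2) + np.exp(-0.5 * (x - 2.0) ** 2)

weights = np.array([0.5, 0.5])   # GM component weights
means = np.array([-1.0, 1.0])    # GM component means (deliberately off-mode)
std = 1.5                        # fixed component width

def gm_pdf(x):
    comps = np.exp(-0.5 * ((x[:, None] - means) / std) ** 2)
    return (comps / (std * np.sqrt(2 * np.pi))) @ weights

for _ in range(20):                                      # adaptation loop
    comp = rng.choice(2, size=2000, p=weights)           # pick components
    samples = means[comp] + std * rng.standard_normal(2000)
    w = target_pdf(samples) / gm_pdf(samples)            # importance weights
    w /= w.sum()
    for k in range(2):                                   # crude refit of the GM
        wk = w[comp == k]
        if wk.sum() > 0:
            means[k] = (wk * samples[comp == k]).sum() / wk.sum()
            weights[k] = wk.sum()
    weights /= weights.sum()

print(np.sort(means))  # the component means drift toward the two modes
```

With each pass the proposal is refit to the weighted samples, so it concentrates where the target has mass. A real implementation would refit the mixture with an EM-style update and evaluate the expensive target model through the PC surrogate rather than directly.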
Worked Well with Small Number of Uncertain Parameters

Though the algorithm worked well for problems involving a small number of uncertain parameters, ongoing research on problems with larger numbers of uncertain parameters indicates that it is better to re-parameterize the problem and represent it with fewer parameters than to sample directly from the high-dimensional probability density function. Further work also involves implementing the algorithm within a sequential importance sampling framework for successive data integration problems; one example problem is the dynamic state estimation of a power grid system.