Monday 13 July 2015

Drones Could Help Snuff Out Future Wildfires


We live in an era of technology, and every technology has its pros and cons. Drones are a case in point: in recent weeks they have both hindered and promised to help the fight against wildfires.

In the last week of June 2015, emergency workers in California were fuming after hobbyist drones disrupted firefighting efforts in the southern part of the state. According to reports, the drones were hovering at 11,000 feet, far above the legal 400-foot altitude limit, and forced a DC-10 loaded with 11,000 gallons of flame retardant to turn away from its drop point. Two other planes heading for the same target were also forced to abort their runs because of the drone interference.

According to the Los Angeles Times, the disruption allowed the fire to spread, causing estimated losses of between $10,000 and $15,000. Even so, researchers and tech experts hope that the machines now frustrating firefighters could one day play a vital role in preventing wildfires. A team of researchers from the University of California, Berkeley has developed a system that uses drones, satellites and airplanes to detect wildfires in a given region.

In search of flames: 

The system the researchers describe is known as FUEGO (Fire Urgency Estimator in Geosynchronous Orbit). It borrows technology originally used to study supernovas in space and points it back toward Earth. The process starts with satellites fitted with powerful infrared cameras that monitor patches of land in California or any other fire-prone region.

Through these cameras, the system can take photos in wavelengths of light that are emitted by fires but invisible to the naked eye, and send them back to the ground, where land managers can track the region in question. Airplanes and drones equipped with infrared cameras would additionally patrol fire-prone areas to paint a higher-resolution picture of the fire risk.

If the drones spot a fire, they can swing into action and provide real-time feedback on its nature and extent. The technology is expected to be especially helpful at night, when aerial tankers are grounded, and in mop-up operations, helping ensure from the sky that every ember is snuffed out.
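
To give a rough sense of the kind of processing such a system would perform on an infrared image, here is a minimal Python sketch that flags unusually hot pixels in a small, made-up brightness grid. The frame values, the temperature threshold and the function names are all illustrative assumptions, not details of FUEGO.

    import numpy as np

    # Hypothetical infrared frame: brightness temperatures (kelvin) for a small patch of land.
    ir_frame = np.array([
        [300, 302, 301, 299],
        [303, 420, 415, 300],   # a cluster of unusually hot pixels suggests a small fire
        [301, 398, 302, 298],
        [299, 300, 301, 300],
    ])

    HOTSPOT_THRESHOLD_K = 360   # assumed threshold; a real system would model background statistics

    def find_hotspots(frame, threshold):
        """Return (row, col) indices of pixels hotter than the threshold."""
        rows, cols = np.where(frame > threshold)
        return list(zip(rows.tolist(), cols.tolist()))

    print("Possible fire pixels:", find_hotspots(ir_frame, HOTSPOT_THRESHOLD_K))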

Is it just a concept? 

No. According to the researchers, once the technology is fully functional the system will be able to detect and analyze even a small fire in just two to five minutes. Carl Pennypacker, an astrophysicist at Lawrence Berkeley National Laboratory and UC Berkeley’s Space Sciences Laboratory and the project’s lead coordinator, says the team hopes to launch the full system within the next few years, since the dedicated satellite it calls for does not yet exist.

In the meantime, the team is testing drones that can send back information about hot zones and provide high-quality images to help shape fire-prevention strategy.

New Horizons Hiccup won't Affect Pluto Mission Science


NASA says its scientists are planning to return New Horizons to normal science operations just before its historic Pluto flyby, now that they have figured out the cause of the spacecraft's weekend glitch.

Jim Green, NASA's director of planetary science, said he was pleased that the mission teams had quickly identified the problem and assured themselves of the health and proper operation of the spacecraft. With Pluto in sight, he added, the team is on the verge of returning to normal operations and "going for the gold".

Last Saturday, the New Horizons team traced the failure to a hard-to-detect timing flaw in the spacecraft's command sequence, which occurred while operations were being prepared for the July 14 flyby. The flaw caused the spacecraft to drop out of communication for more than 90 minutes.

Communication was restored, in a protective safe mode, after control was switched from the probe's primary computer to its backup. Glen Nagle, spokesman for the Canberra Deep Space Communication Complex at Tidbinbilla, said the spacecraft is now fully operational again, able to downlink data and receive commands.

The piano-sized spacecraft let scientists and engineers know that it could still receive and transmit messages and commands, and the New Horizons team worked through routine troubleshooting to track down the glitch. Bringing the spacecraft back to normal mode will take a few days, not least because signals travelling at the speed of light take about four and a half hours to reach the probe and another four and a half hours for its response to come back.
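
That figure is easy to sanity-check. Assuming an Earth-to-spacecraft distance of roughly 4.8 billion kilometres (an approximation for illustration, not a number from NASA's statement), a quick calculation reproduces the quoted delay:

    SPEED_OF_LIGHT_KM_S = 299_792            # speed of light in vacuum
    EARTH_TO_PROBE_KM = 4.8e9                # assumed Earth-to-New Horizons distance

    one_way_hours = EARTH_TO_PROBE_KM / SPEED_OF_LIGHT_KM_S / 3600
    print(f"One-way signal time: {one_way_hours:.1f} hours")          # about 4.4 hours
    print(f"Command plus response: {2 * one_way_hours:.1f} hours")    # roughly 9 hours round trip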

The operation that triggered the flaw is not scheduled to be repeated before the flyby. New Horizons is currently 9.9 million kilometers (6 million miles) from Pluto and is closing in on its flyby at about 50,000 kilometers per hour (30,000 mph).

According to NASA, the outage will not affect the $728 million mission's ability to meet its primary objectives. Alan Stern, the mission's principal investigator at the Southwest Research Institute, said that in terms of science output the loss amounts to little more than the difference between an A-plus and an A.

New Horizons was launched nine years ago with the objective of studying Pluto and its moons. With its instruments, the spacecraft will map the surface of the dwarf planet, gather data about its composition, and taste the dust in the nitrogen-rich atmosphere surrounding it.

The craft will also take high-resolution close-up pictures of the dwarf planet to help characterize its surface. After the flyby, New Horizons is expected to keep sending back data for 16 months or more, and the team is already drawing up plans for a possible follow-on flyby.

Saturday 11 July 2015

US Researchers Find A Way to Make Internet Access Cheaper, Faster


Recently, US researchers found a way to increase the speed at which data travels over fiber optic networks, and the technique is expected to be adopted commercially to provide cheaper and faster internet. According to Nikola Alic, a data scientist at the Qualcomm Institute, part of the University of California, San Diego, the technique will take a couple of years to make a meaningful impact, depending on the implementation process and the determination of the technical community. Today's fiber systems, he added, are a bit like quicksand: the more you struggle by pushing more power into the fiber, the faster you sink.

The breakthrough significantly improves the performance of fiber networks, benefiting both ISPs (internet service providers) and consumers. The researchers found a way to manage the distortion that arises when more power is pushed through the network, allowing data to travel longer distances before it has to be reconditioned by an electronic regenerator.

In the long run, the findings could eliminate the need for electronic regenerators altogether. Fiber optic cables can carry data over long distances, but the signal degrades with the distance travelled, and when you try to boost speed by pumping more power into the network, the degradation gets worse.

The main concern with regenerators is that they have to be applied to each of the 80 to 200 data channels in a fiber, which makes them power-hungry and expensive. To reduce the number of regenerators in the network, the researchers used wideband frequency combs to make the signal distortion, known as crosstalk, predictable.

Because the crosstalk introduced in the fiber is then predictable, it can be unraveled and the signal restored to its original state when it arrives at its final destination. Crosstalk is not random, Alic explained; it is governed by strict physical laws, even though in the lab it had long appeared random.

According to Doug Brake, a telecom policy analyst at The Information Technology & Innovation Foundation, the advance changes how information is sent over long-haul fiber optic networks, but it does not remove the limiting factor in building out broadband, which remains the last-mile infrastructure. In practical terms, the researchers found that the frequency variations across the communication channels need to be tuned at the source, before transmission.
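
The core idea, that a deterministic distortion can be anticipated and cancelled at the transmitter, can be illustrated with a toy example. The sketch below uses an invented cubic nonlinearity and simple digital pre-distortion; it is only a cartoon of the principle, not the researchers' frequency-comb technique.

    import numpy as np

    K = 0.05    # assumed strength of the (known, deterministic) distortion

    def channel(signal):
        """Toy deterministic channel: adds a known cubic distortion."""
        return signal + K * signal ** 3

    def predistort(signal):
        """First-order pre-compensation: subtract the distortion the channel will add."""
        return signal - K * signal ** 3

    rng = np.random.default_rng(0)
    tx = rng.choice([-1.0, 1.0], size=8)            # a few random binary symbols

    plain = channel(tx)                              # sent as-is, distorted on arrival
    compensated = channel(predistort(tx))            # pre-distorted at the source first

    print("max error without pre-compensation:", np.max(np.abs(plain - tx)))
    print("max error with pre-compensation:   ", np.max(np.abs(compensated - tx)))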

In the lab, the researchers were able to retrieve information intact after it had travelled 7,400 miles (12,000 kilometers) through fiber optic cable and standard amplifiers. Experts say the technology will be particularly attractive to newer ISP players such as Google, which are adopting aggressive strategies and technologies to capture market share.

Algorithm Accounts For Uncertainty To Enable More Accurate Modeling


Data Integration Algorithm – Improving Modelling Accuracy

Parametric uncertainty is a significant error source in modelling physical systems: the values of the model parameters characterizing the system are not known precisely because of inadequate knowledge or limited data. Data integration algorithms can improve modelling accuracy by quantifying and reducing this uncertainty, but they frequently require a huge number of repeated model evaluations, incurring substantial computational costs.

To address this issue, PNNL's Dr Weixuan Li, together with Professor Guang Lin from Purdue University, proposed an adaptive posterior sampling algorithm that alleviates the burden imposed by computationally demanding models. In three test cases they demonstrated that the algorithm could effectively capture the complex posterior parametric uncertainties of the problems examined while improving computational efficiency. With the great headway made by modern computers, numerical models are routinely used to simulate physical system behaviour in scientific fields ranging from climate to chemistry and from materials to biology, several of them within DOE's core mission areas.

Several Potential Applications

However, parametric uncertainty often arises in these models because of insufficient knowledge of the system being simulated, producing models that diverge from reality. The algorithm created in this study offers an effective means of inferring model parameters from direct and/or indirect measurement data through uncertainty quantification, thereby improving model accuracy. It has several potential applications; for instance, it can be used to estimate the unknown location of an underground contaminant source while improving the accuracy of the model that predicts how groundwater is affected by that source. Two key techniques implemented in the algorithm are:

  •  A Gaussian mixture (GM) model, adaptively built to capture the distribution of the uncertain parameters 
  •  A mixture of polynomial chaos (PC) expansions, built as a surrogate model to relieve the computational burden of forward model evaluations

Together, these give the algorithm great flexibility in handling multimodal distributions and strongly nonlinear models while keeping computational costs low; a rough sketch of the first ingredient follows below.
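
As a loose, hypothetical illustration (not the authors' implementation), the sketch below runs a few rounds of adaptive importance sampling for a one-parameter Bayesian problem, refitting a two-component Gaussian mixture to the weighted samples at each step and reusing it as the next proposal. The forward model, noise level and prior are all invented; in a realistic setting a polynomial chaos surrogate would stand in for the expensive forward model.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)

    # Invented toy problem: infer theta from one noisy measurement of theta squared.
    def forward_model(theta):
        """Cheap stand-in for an expensive simulator (a PC surrogate would sit here)."""
        return theta ** 2

    theta_true = 1.3
    sigma = 0.1                                                 # assumed measurement noise
    data = forward_model(theta_true) + rng.normal(0.0, sigma)   # one noisy observation

    def log_posterior(theta):
        """Unnormalized log posterior: Gaussian likelihood times a flat prior on [-4, 4]."""
        log_like = -0.5 * ((data - forward_model(theta)) / sigma) ** 2
        return np.where((theta > -4.0) & (theta < 4.0), log_like, -np.inf)

    # Adaptive importance sampling with a Gaussian-mixture proposal.
    n_samples, n_iters = 2000, 4
    proposal = GaussianMixture(n_components=2)                  # posterior is bimodal (+/- theta)
    samples = rng.uniform(-4.0, 4.0, size=n_samples)            # initial proposal: the flat prior
    log_q = np.full(n_samples, -np.log(8.0))                    # its log density on [-4, 4]

    for _ in range(n_iters):
        log_w = log_posterior(samples) - log_q                  # importance weights
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        resampled = rng.choice(samples, size=n_samples, p=w)    # equally weighted resample
        proposal.fit(resampled.reshape(-1, 1))                  # adapt the GM toward the posterior
        samples = proposal.sample(n_samples)[0].ravel()         # draw from the adapted proposal
        log_q = proposal.score_samples(samples.reshape(-1, 1))

    modes = sorted(proposal.means_.ravel())
    print(f"recovered posterior modes near {modes[0]:.2f} and {modes[1]:.2f} "
          f"(true parameter magnitude {theta_true})")
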
Worked Well with Small Number of Uncertain Parameters

Though the algorithm worked well for problems with a small number of uncertain parameters, ongoing research on problems with a larger number of uncertain parameters indicates that it is better to re-parameterize the problem, representing it with fewer parameters, than to sample directly from the high-dimensional probability density function. Further work also involves implementing the algorithm within a sequential importance sampling framework for successive data integration problems; one example problem is dynamic state estimation of a power grid system.

The New Microsoft Edge Browser Logo


Microsoft’s New Edge Browser – Break from Internet Explorer

Microsoft's new Edge browser was meant to mark a break from Internet Explorer, yet when the company recently unveiled Edge's new logo, it looked remarkably similar to the old IE logo, just a little sharper. Microsoft Edge, the new name for the web browser that will be included in Windows 10, has a familiar look for users of Microsoft's current browser, Internet Explorer. Since 1996, the blue 'e' with the orbital ring around it has symbolized Internet Explorer; IE 3 was the first version to sport the logo.

In recent years, however, the 'e' became synonymous with bugs, outdated technology and security problems. When Joe Belfiore, Microsoft's operating systems chief, unveiled Edge in April, he noted that Edge would keep an 'e' icon but said it 'now has a completely different and better meaning than it has for a while'. Edge is an improvement over IE and much trimmer, without the various menu choices and bells and whistles, and it has a clean, modern look much like Google's Chrome browser. Users can mark up websites in Edge, which also integrates some of the powers of Microsoft's digital assistant, Cortana.

Integrating Improved Features 

Retaining an Edge logo so close to IE's is consistent with Microsoft's approach to Windows 10. With the updated operating system set to be released on July 29, Microsoft is trying to make the software feel familiar to users who have been on Windows for a long time and were alarmed by the radical design change of Windows 8. At the same time, the company is adding improved features intended to make Windows 10 much more than just an 'everything old is new again' operating system.

The new Edge logo is also an improvement over the temporary logo Microsoft gave the browser before deciding on its official name. When Edge was still known as 'Project Spartan', its logo looked like the World Bank's logo crossed with Toyota's symbol. Microsoft pitches Edge as the only browser that lets users take notes, doodle, write and highlight directly on webpages: the web becomes a palette where you can jot a few secret ingredients onto a recipe right on the screen and share it with friends, or collaborate on a new project with co-workers.

Default Browser on PC/Mobile Device Editions of Windows 10

Microsoft Edge will serve as the default browser on both the PC and mobile device editions of Windows 10, replacing Internet Explorer 11 and Internet Explorer Mobile. It uses a new layout engine known as EdgeHTML, forked from Trident and designed for interoperability with the modern web.

Microsoft originally planned for the updated Edge engine to be used across Windows 10 by default, with the legacy MSHTML engine retained for backwards compatibility, but it later backed down: following strong feedback, Edge will host the new engine exclusively while Internet Explorer will host only the legacy engine. Edge will not support legacy technologies such as ActiveX and Browser Helper Objects, relying instead on an extension system.

Internet Explorer 11 will remain available alongside Edge on Windows 10 for compatibility purposes; it will stay almost identical to the Windows 8.1 version and, contrary to earlier announcements, will not use the Edge engine.