On the campaign trail in the USA, June 2020

Saturday, July 25, 2020

The following is the second edition of a monthly series chronicling the 2020 United States presidential election. It features original material compiled throughout the previous month after an overview of the month’s biggest stories.

This month’s spotlight on the campaign trail includes interviews with the vice presidential nominees of the Prohibition Party, Reform Party, and the Life and Liberty Party.

Keep your eyes peeled for cosmic debris: Andrew Westphal about Stardust@home

Sunday, May 28, 2006

Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as “Wild 2”) in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to “discover” the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.

Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.

Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?

Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced “Vilt-two” — the discoverer was German, I believe). This is the first US “sample return” mission since Apollo, and the first ever from beyond the moon. This gives a little context. By “sample return” of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first “solid” sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we’re obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket on the JPL Stardust website — highly recommended — best I’ve ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.

Is the video available to the public?

Yes [see below]. OK, I digress. The first challenge that we have before we can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (on the order of a micron in size) and are somewhere (we don’t know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So

We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon Rocks, Genesis chips, Meteorites, and Interplanetary Dust Particles collected by U2 in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.

Stardust@home is a highly distributed project using a “Virtual Microscope” that is written in HTML and JavaScript and runs in most browsers — no downloads are required. Using the Virtual Microscope, volunteers can search across the collector for the tracks of the interstellar dust particles.
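To make the browser-based search concrete, here is a minimal, hypothetical sketch of how a focus-stack viewer of this kind could be wired up in TypeScript. The class name, the image-URL scheme, and the wheel-to-focus behaviour are all illustrative assumptions, not the actual Stardust@home Virtual Microscope code.

```typescript
// Minimal, hypothetical sketch of a browser-based focus-stack viewer.
// Class name, URL scheme, and controls are illustrative assumptions only;
// this is not the actual Stardust@home Virtual Microscope code.

class FocusStackViewer {
  private img: HTMLImageElement;
  private depth = 0;

  constructor(private container: HTMLElement, private frameUrls: string[]) {
    this.img = document.createElement("img");
    container.appendChild(this.img);

    // The mouse wheel moves the focal plane up and down through the image
    // stack, imitating focusing a real microscope through the aerogel.
    container.addEventListener("wheel", (event) => {
      event.preventDefault();
      this.setDepth(this.depth + (event.deltaY > 0 ? 1 : -1));
    });

    this.setDepth(0);
  }

  private setDepth(d: number): void {
    // Clamp to the focus levels actually present in this field of view.
    this.depth = Math.max(0, Math.min(this.frameUrls.length - 1, d));
    this.img.src = this.frameUrls[this.depth];
  }
}

// Hypothetical usage: one field of view served as 40 focus slices.
const frames = Array.from({ length: 40 }, (_, i) => `/tiles/field_0001/focus_${i}.jpg`);
new FocusStackViewer(document.getElementById("viewer")!, frames);
```

Volunteers would then scan each field of view for the carrot-shaped tracks described later in the interview, rather than for the particles themselves.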

How many samples do you anticipate being found during the course of the project?

Great question. The short answer is that we don’t know. The long answer is a bit more complicated. Here’s what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of “wind” of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the “cruise” phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn’t be surprised if it is quite different from that. That was the long answer! One thing that I should say… is that like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — “If we knew what we were doing, it wouldn’t be called ‘research’, would it?”
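An estimate of this kind is essentially flux times collecting area times exposure time. The numbers below are purely illustrative stand-ins chosen to land near the quoted figure of roughly 45 particles; they are not Landgraf’s actual inputs.

```latex
% Illustrative only: expected captures = flux x collector area x exposure time
\[
  N \approx F \, A \, t
    \approx \left(3\times10^{-5}\ \mathrm{m^{-2}\,s^{-1}}\right)
    \times \left(0.1\ \mathrm{m^{2}}\right)
    \times \left(1.5\times10^{7}\ \mathrm{s}\right)
    \approx 45
\]
```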

How big would the samples be?

We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope); I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.

And that’s on the main Stardust@home website [see below]?

Yes.

How long will the project take to complete?

Partly the answer depends on what you mean by “the project”. The search will take several months. The bottleneck, we expect (but don’t really know yet), is in the scanning — we can only scan about one tile per day and there are 130 tiles in the collector…. These particles will be quite diverse, so we’re hoping that we’ll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little beyond the search, though, the next step is to actually analyze these particles. That’s the whole point, obviously!
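As a quick sanity check on “several months”, the scanning figures quoted above work out as follows, with the volunteer search running on top of the scanning:

```latex
\[
  \frac{130\ \text{tiles}}{1\ \text{tile per day}} \approx 130\ \text{days} \approx \text{four to five months}
\]
```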

And this is the huge advantage with this kind of a mission — a “sample return” mission.

Most missions do things quite differently… you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don’t have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!

When do you anticipate the project to start?

We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.

And rest assured that we’re just as frustrated!

I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?

The test will look very similar to the training images that you can look at now. But… there will of course be no annotation to tell you where the tracks are!

Why did NASA decide to take the route of distributed computing? Will they do this again?

I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…

If I understand correctly it isn’t distributed computing, but distributed eyeballing?

…from the SETI@home people, who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home; the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).

That said… there have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don’t know how to write such an algorithm doesn’t mean nobody does. We’re delighted at this and are happy to help make it happen.

Isn’t there a catch-22 in that the data you’re going to collect would be a prerequisite to automating the process?

That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?

Will a project like this be done again?

I don’t know… There are only a few projects for which this approach makes sense… In fact, I frankly haven’t run across another at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as “make-work” — that is just artificially taking this approach when another approach would make more sense.

How did the idea come up to do this kind of project?

Really, desperation. When we first thought about this we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn’t know until we had some real examples to work with. So we talked with Dan Werthimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)

I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteorites in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?

That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.

Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged and stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger? Giving out samples as a show of good faith, and not letting NASA examine all samples collected?

These will be the first measurements, probably, that we’ll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it’s not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that except in a very few cases, we don’t know where they specifically came from. So having a sample that we know for sure is from the comet is golden!

I am currently working on my Bachelor’s in computer science, with a minor in astronomy. Do you see the successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I’m not in the typical “space” fields of education?

Can you elaborate on your question a little — I’m not sure that I understand…

Well, while at JSC I learned that they mostly want engineers, and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work study program are CS majors. I’m just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes? Have you seen a trend towards more private businesses moving in that direction, especially with President Bush’s stated goal of a man on the Moon by 2015?

That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.

I made a joke with some people at the TAS event that one day SpaceShipOne would be sent up to save stranded ISS astronauts. It makes me wonder what kind of private-sector redundancy the US government is counting on for future missions.

I guess one thing to be a little cautious about is that despite SpaceShipOne’s success, we haven’t had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there’s a lot of interest…!

Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?

The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.

We have some fun things, including micromachines.

How many people/participants do you expect to have?

About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!

One last thing I want to say … well, two. First, we are making a special effort not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jumpstart on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!

Assessing a Real Estate Market Population Movement

Submitted by: Raynor James

The number one question most people face is whether they should buy in a real estate market and, if so, when they should buy. Here is one trick you can use to make the decision.

Assessing a Real Estate Market Population Movement

If you did not know any better, you would be forced to assume that we Americans are a restless bunch. Unlike in the 20th century, we rarely seem to stay in the same place for long. Factors that lead to this migratory conduct include job loss, tax issues, cost of living and so on. Given that we tend to move so often, most people fail to realize what impact this has on real estate markets.

When assessing a real estate market, population movement is something you should really focus on. In many markets, it is pretty hard to ascertain and is not really an issue in the valuation and appreciation rates of properties. On the other hand, if you can identify such a market, you stand to make a killing. Let’s look at a classic example.

One of the fastest, if not the fastest, growing cities is Las Vegas, Nevada. Something like four people a minute move to the city. Why? Well, prices are generally cheaper than where they are moving from and there is no income tax collected by the state. As these people move into the area, the demand for housing becomes massive. This demand has been reflected in exploding appreciation rates in Las Vegas for the last few years. Alas, the market has cooled, but that does not mean you should avoid buying a property. People are still moving into the city at an amazing rate and they still need places to live. While the market has cooled for a bit, it will get hot again within a year. There is simply too much demand for housing.

A less obvious population movement has to do with homeowners moving from high property value areas to lower property value areas. For example, homeowners in Southern California have a habit of selling off their tract homes for massive profits and moving to states where they can buy much bigger homes for a fraction of the price of their previous home. If you can identify where these wayward, but rich, souls are going, you can buy in that market before or while prices are being driven through the roof. One area that was very popular with these people was Seattle and the state of Washington. Prices have nearly doubled in that area given the influx of Californians.

Population movement is an important trend in evaluating a real estate market. It is not always present in every market, but you can make a killing if it is.

About the Author: Raynor James is with fsboamerica.org – homes for sale by owner.

Source: isnare.com

Permanent Link: isnare.com/?aid=84023&ca=Real+Estate

Wikinews interviews Bill Hammons, Unity Party of America presidential nominee

Friday, October 23, 2020

Wikinews accredited reporter William S. Saturn reached out to Unity Party of America presidential nominee Bill Hammons of Colorado to discuss Hammons’s 2020 campaign for President of the United States.

Hammons, a former Newsweek manager and owner of the website “Bill’s List”, founded the Unity Party in 2004 with supporters of General Wesley Clark’s unsuccessful campaign for the Democratic Party’s presidential nomination. The party, which describes itself as centrist, advocates in its constitution for, among other things, a balanced budget amendment, elimination of the federal income tax, tax deduction for health care costs, a global minimum wage for fair trade, term limits for Congress and judges, lowering the voting age, DC statehood, and expanded space exploration.

Hammons has grown the party with various campaigns for public office. He ran for US Congress in 2008 and 2010, US Senate in 2014 and 2016, and for Governor of Colorado in 2018. Last year, Hammons embarked on a presidential campaign and became the Unity Party’s first presidential nominee. Engineer Eric Bodenstab, the party’s 2018 nominee for Lieutenant Governor of Colorado, was picked as his running mate. Bodenstab spoke to Wikinews last August. The Hammons-Bodenstab ticket has qualified for ballot access in Colorado, Louisiana, and New Jersey.

With Wikinews, Hammons discusses his background, campaign, the COVID-19 pandemic, the U.S. Supreme Court, and Black Lives Matter, among other issues.

Existing US home sales fall 9.6% in February

Monday, March 21, 2011

Sales of existing homes in the U.S. fell 9.6% in February, the National Association of Realtors (NAR) said today, in a sign that the U.S. housing market is still depressed. The figure was worse than the 3.9% decline anticipated by economists surveyed by Dow Jones Newswires and raises questions about whether the U.S. housing market is beginning to recover or will continue to fall.

A combination of foreclosures and short sales, in which the house is sold for less than is owed on the mortgage, accounted for almost 40% of the sales.

Millions of foreclosures have forced down home prices, and the number of foreclosures is predicted to rise this year. The inventory of existing homes listed for sale rose 3.5% at the end of February, an 8.6-month supply at the current sales rate. As more homes are listed in the spring, the inventory of houses for sale will probably increase. A five- or six-month inventory is usually considered a healthy balance between supply and demand.
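For readers unfamiliar with the metric, “months of supply” is the standard inventory measure used here: how long the current listings would last at the current monthly sales pace. A minimal statement of the definition:

```latex
\[
  \text{months of supply} = \frac{\text{homes listed for sale}}{\text{homes sold per month at the current sales pace}}
\]
```

So the 8.6-month figure means February’s inventory would take about eight and a half months to clear at February’s sales rate, well above the five-to-six-month range described as balanced.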

According to Moody’s Analytics, another 3.6 million bank-owned homes and possible foreclosures will be added to the inventory by 2013, adding to the 6.7 million home foreclosures since 2006. Thus housing inventories will probably continue to remain high, delaying the point when prices stabilize. The median sales price in February fell 5.2%, down to a price level not seen since April 2002.

“We have an uneven, choppy recovery,” said NAR’s chief economist Lawrence Yun. “Hopefully it is a recovery that is taking place.”

New Zealand pilot selling uniform online

Tuesday, August 15, 2006

“Jeremy”, an airplane pilot in Nelson, has put his uniform on sale for NZ$1.00 at New Zealand auction site TradeMe. The pilot lost his job when Origin Pacific Airways fired 230 staff because of financial difficulties, and he is now trying to “make ends meet this week,” he said. The auction had already attracted 31 bids and reached $101 as of 7:30 a.m. on August 15.

‘Jeremy’ said on the auction page that he is offering “a rare piece of Kiwi aviation history”.

“This is an authentic Origin Pacific Jetstream pilot’s uniform lovingly drycleaned for one last time before its unfortunate retirement. It’s travelled many stormy nights and many sunrises and has seen almost 50,000 Kiwis safely to their destinations from Invercargill to the Far North. It has 778,000km on it, yet it looks as crisp today as it did when it was born,” the pilot described on TradeMe.

One bidder had offered $40 to buy the uniform immediately, but that offer was turned down so that the auction could run until Sunday.

David Collier, Origin Pacific passenger services general manager, said “The uniform was Origin Pacific’s property and the pilot should not be selling it, but I think it’s probably one of those things you’d have to grin at and move on.”

Jay, from Waitakere, asked on the auction page if the uniform came with a plane. The pilot replied, “No, but if you want to start an airline I know where you can find a great bunch of people.”

The auction closes on Sunday, August 20.

Zara Kay tells Wikinews about her non-profit organisation Faithless Hijabi

Monday, July 6, 2020

A number of Muslim-majority countries around the world implement Shari’a — commonly known as Islamic law — and have laws against apostasy and blasphemy. Numerous times over the years, people have been sentenced to death for renouncing Islam. Back in 2018, Pakistani journalism student Mashal Khan was lynched by a mob after he was accused of blasphemy. At times there have been protests against the restrictions on free speech in Islam.

Besides restricting free speech, many Muslim-majority countries have declared homosexuality a capital crime and enforce a strict dress code for women. Iran has banned a number of female chess players for not wearing a hijab. An Iranian woman was sentenced to 20 years for removing her hijab while protesting the strict dress code.

Wikinews got in touch with Tanzanian-born ex-Muslim Zara Kay to discuss the struggles an ex-Muslim woman faces, as well as her organisation, Faithless Hijabi. Faithless Hijabi is an organisation which helps other ex-Muslim women by sharing their stories and experiences. Its Facebook page has over 7000 likes, and Zara Kay, who identifies herself as an antitheist, had previously helped Saudi teenager Rahaf Mohammed escape to Canada.

The following is the interview with Zara Kay that took place last year.

Japan raises severity level of crisis; efforts to cool damaged nuclear power plant continue

Friday, March 18, 2011

As the nuclear crisis in Japan’s crippled Fukushima I Nuclear Power Plant appears to worsen, Japan’s nuclear safety agency raised its assessment of the severity from 4 to 5 on the 7-level International Nuclear Event Scale, the same rating given to the 1979 Three Mile Island accident. Japan’s Prime Minister, Naoto Kan, said bluntly that the situation at the nuclear power plant was “very grave”. Weather forecasts indicate changing winds may begin moving radiation closer to Tokyo by March 30.

Efforts thus far to cool the nuclear fuel in the reactors and the spent-fuel pools have produced little if any success, United States government officials contend.

Engineers are working frantically to connect electrical power to two reactors in the plant, as well as to restart the cooling systems and prevent overheating of fuel rods. Tokyo Electric Power Co. stated that it hopes to reconnect a power line needed to restart water pumps to the No. 1 and No. 2 reactors by Saturday morning. However, a TEPCO official cautioned that if the water pumps were damaged by the tsunami, they could fail to restart.

The extent of the damage to the plant’s reactors is still unclear. Japanese officials have concentrated on cooling spent fuel rods in Reactor No. 3’s storage pool. On Friday, however, steam was seen rising from Reactor No. 2, where an explosion occurred on Tuesday. Additionally, engineers said on Thursday that the steel lining of the storage pool at Reactor No. 4 and its concrete base seemed damaged, as attempts to refill the pool with water became increasingly difficult.

In a briefing on Friday, Philippe Jamet, a commissioner at France’s nuclear regulator Autorité de Sûreté Nucléaire, said, “We must avoid being overly optimistic. This will likely take human intervention like going into control rooms to reconnect valves.”

News briefs:April 28, 2005

Thursday, April 28, 2005

An Optimized IT Management Process with the Cloud Technology

by ThomasW

Cloud computing is a technology in high demand right now. With a cloud platform you can access services from anywhere, at any time, using a mobile device or a thin client. Processes deployed on the cloud are easy to use and manage, which reduces your time and costs. In a world rife with competition you have to act wisely and promptly, and cloud services are fast enough to let you make critical business decisions accurately, based on real-time updates about the latest changes in the market.

Cloud computing has proved to be a boon in managing IT costs. There is no requirement for additional infrastructure or facilities. The infrastructure is highly integrated and encourages collaboration among the various IT processes and frameworks. Because compatibility between the processes is high, costs are reduced and efficiency is enhanced, resulting in increased revenues and profits. The technology provides excellent network support services, such as consistent and uninterrupted network connectivity, and latency issues are eliminated, delivering an advanced and highly effective model.

Cloud computing deploys ITIL asset management best practices that produce optimum results. It contributes productively towards asset discovery, creation and maintenance of hardware and software libraries, physical asset tracking, configuration management, procurement management, requests and approvals, contract management, vendor and supplier management, redeployment and movement, and retirement and disposal management.

The cloud provides standardized and comprehensive IT security consulting services for the prevention of data leakage. You are provided with secure SDLC design and coding practices, source code reviews, and regular audits and reports that help you modify your security strategies for more effective protection.

The cloud technology helps enforce a streamlined, automated, centralized IT management process that reduces complexity and risk and improves performance. It provides consistent server monitoring and enhances the visibility of the process. The deployed processes are energy-efficient and based on a lean management model that eliminates wasteful procedures and helps save costs. You are also provided with service desk support that lets you easily create requests, resolve and report incidents, and review the status of your submitted requests. The cloud help desk helps restore normal operations without delay, avoiding adverse impacts on your business. With such capabilities you can expect high returns on investment and minimal damage and loss.
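As a rough illustration of the asset lifecycle named above (discovery, tracking, procurement, redeployment, retirement and disposal), here is a minimal, hypothetical sketch of how such an asset record might be modeled. All type and field names are illustrative assumptions, not taken from ITIL itself or from any specific cloud product.

```typescript
// Minimal, hypothetical sketch of an ITIL-style asset record covering the
// lifecycle stages listed above (discovery through retirement and disposal).
// All type and field names are illustrative, not from any specific tool.

type LifecycleStage =
  | "discovered"
  | "procured"
  | "deployed"
  | "redeployed"
  | "retired"
  | "disposed";

interface AssetRecord {
  id: string;
  kind: "hardware" | "software";
  vendor: string;
  location: string;                                // physical asset tracking
  stage: LifecycleStage;
  history: { stage: LifecycleStage; at: Date }[];  // auditable lifecycle trail
}

// Advancing an asset through its lifecycle appends to the history, which is
// what configuration, contract and vendor management reporting can build on.
function advance(asset: AssetRecord, next: LifecycleStage): AssetRecord {
  return {
    ...asset,
    stage: next,
    history: [...asset.history, { stage: next, at: new Date() }],
  };
}

// Hypothetical usage: a discovered laptop is procured and then deployed.
let laptop: AssetRecord = {
  id: "HW-0001",
  kind: "hardware",
  vendor: "ExampleVendor",
  location: "Warehouse A",
  stage: "discovered",
  history: [{ stage: "discovered", at: new Date() }],
};
laptop = advance(laptop, "procured");
laptop = advance(laptop, "deployed");
```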

Read more on – asset management, IT security consulting, service desk support, data center, green datacenter.

Article Source: ArticleRich.com