Microsoft – EvaluateSolutions38
https://evaluatesolutions38.com
Latest B2B Whitepapers | Technology Trends | Latest News & Insights

Google Is Planning to Enhance its Existing AI Features and Develop a New Search Engine
https://evaluatesolutions38.com/news/tech-news/artificial-intelligence-news/google-is-planning-to-enhance-its-existing-ai-features-and-develop-a-new-search-engine/
Wed, 19 Apr 2023 16:43:08 +0000

Highlights:

  • According to The Times, Google’s project lacks a clear timeline, making it unclear when the features might go live.
  • The Times reports that Google plans to debut Magi next month before adding more new features in the following months.

With its core business facing its most serious threat in years, Google LLC is reportedly rushing to introduce new artificial intelligence-powered features and capabilities in its search engine.

According to reports, the company is developing a brand-new, AI-powered search engine and is considering updating its current search technology with AI features.

The changes are Google’s response to Samsung Electronics Co. Ltd.’s suggestion that it might stop using Google Search and switch to Microsoft Bing as its default mobile search engine, the New York Times reported recently.

According to The Times, Google could lose Samsung and suffer a loss of revenue of more than USD 3 billion annually. The suggestion allegedly caused widespread “panic” within Google, forcing the company to scramble to keep up with the surge in demand for technologies like ChatGPT.

According to details the Times obtained from internal emails, Google’s response is to update its search engine as part of a project called “Magi.” Google reportedly has 160 employees working in “sprint rooms” to develop new AI-powered Google Search features.

Google is said to have been in a frenzy since December of last year, when executives first grasped the significance of OpenAI LP’s ChatGPT and the problem it might pose for search. When Microsoft Corp. revealed plans to integrate ChatGPT with Bing in February, the threat to Google’s decades-long search market dominance only grew. Sundar Pichai, the CEO of Google, responded by pledging to soon update Google Search with new AI chat features.

One of the new features Google is developing as part of a “more personalized” experience is a service that will try to predict what users are looking for before they search. According to The Times, the project lacks a clear timeline, making it unclear when the features might go live.

A Chrome feature called “Searchalong” that would scan the website the user is reading and provide contextual information is among the other new features rumored to be in the works. The company is also developing a chatbot that can provide code snippets in response to questions about software engineering. A second chatbot would aid in music discovery. More experimental features are in development as well, including “GIFI,” which would let users ask Google Image Search to create images, and “Tivoli Tutor,” which would let them practice another language by chatting with a bot.

However, it should be noted that many of these features are only partially original. For instance, Google Slides already has an image generation function, and Tivoli Tutor sounds a lot like Duolingo Inc.’s language-learning app.

According to analyst Charles King of Pund-IT Inc., Google’s apparent panic and haste to enhance its search engine show how flawed the ad-based search model has become. He said, “Once upon a time, a search engine’s value was based on the quality of results it delivered, but today it’s likely that the top five or ten results you see for any given search will consist of sponsored ad links from some commercial entity.”

As a result, all internet users could gain from improved search capabilities. King said he would be surprised if Google couldn’t produce new AI-based tools that are at least on par with Microsoft’s, if not superior to them.

King said, “That said, the history of the tech industry is littered with stories of once-unstoppable firms that were undermined by more nimble and advanced competitors. Remember when Microsoft Explorer dominated the browser market to the point that the company was successfully challenged on anti-trust grounds? Then along came Google Chrome. Maybe this is just the latest tale of ‘what goes around, comes around.’”

According to Constellation Research Inc.’s Holger Mueller, who is more upbeat, Google’s plan to create a brand-new search engine based on generative AI makes sense because incremental innovation may only go so far in developing next-generation search. The analyst said, “At the same time, the coming reported updates are a good move as they can hedge against Microsoft Bing’s new AI capabilities. Though Google will in any case need to be cautious, as the verdict is still out on whether or not generative AI can really improve search experiences.”

According to The Times, Google plans to introduce Magi in the coming month before following up with more features in the fall. Under this timeline, more information about Magi might be made public on May 10 during Google I/O 2023. Google reportedly intends to make Magi’s features available to 1 million test users first, expanding to 30 million users by the end of the year. Magi will initially be available only in the United States.

In a statement, Google declined to directly address the Times’ claims but said it has been incorporating AI capabilities into Google Search for years through features like Lens and multisearch, among others.

A Google spokesperson said, “We’ve done so in a responsible and helpful way that maintains the high bar we set for delivering quality information. Not every brainstorm deck or product idea leads to a launch, but as we’ve said before, we’re excited about bringing new AI-powered features to Search, and will share more details soon.”

Recorded Future Introduces GPT-powered Threat Analytics Model
https://evaluatesolutions38.com/news/security-news/recorded-future-introduces-gpt-powered-threat-analytics-model/
Thu, 13 Apr 2023 13:56:45 +0000

Highlights:

  • Three years ago, Insight Partners bought a majority stake in the company in a deal worth more than USD 780 million.
  • Recorded Future is one of many businesses using OpenAI’s GPT family of language models to assist cybersecurity teams in their work.

Recorded Future Inc. has just released a cybersecurity tool that uses an OpenAI LP artificial intelligence model to identify threats.

The software platform by Boston-based Recorded Future enables businesses to monitor hacker activity. The platform, for instance, can be used by a bank to find new malware campaigns that target the financial industry. Recorded Future says that over 50 percent of the Fortune 100 companies use its technology.

Three years ago, Insight Partners acquired a majority stake in the company in a deal worth more than USD 780 million.

The new tool that the company unveiled recently, Recorded Future AI, is built using a neural network from OpenAI’s GPT series of large language models. The most recent neural network in the GPT series, GPT-4, debuted last month. There are also more than a dozen additional AI models in the product line with various feature sets.

Companies continuously gather information about user activity, applications, and hardware in their networks to identify breaches. In the past, cybersecurity teams manually examined that data to look for malicious activity. Recorded Future AI aims to make that task easier.

The company claims that its new tool automatically locates breach indicators in a company’s network and ranks them by severity. It also identifies weaknesses: for instance, the tool can determine whether a server has a configuration error that enables users to log in without a password.
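The ranking idea described above can be sketched roughly as follows (a minimal illustration only, not Recorded Future’s API; the `Indicator` type, the 0–10 severity scale, and the sample findings are assumptions):

```python
# Illustrative sketch: order breach indicators most severe first,
# the way the article describes Recorded Future AI prioritizing findings.
from dataclasses import dataclass


@dataclass
class Indicator:
    description: str
    severity: float  # assumed scale: 0.0 (benign) .. 10.0 (critical)


def rank_indicators(indicators: list[Indicator]) -> list[Indicator]:
    """Return the indicators ordered from most to least severe."""
    return sorted(indicators, key=lambda i: i.severity, reverse=True)


findings = [
    Indicator("Server allows login without a password", 9.1),
    Indicator("Outdated TLS configuration", 5.4),
    Indicator("Unusual outbound traffic spike", 7.8),
]
for ind in rank_indicators(findings):
    # The passwordless-login finding (9.1) prints first.
    print(f"{ind.severity:>4} {ind.description}")
```

A real product would, of course, derive the severity score from telemetry rather than hard-code it; the sketch only shows the prioritization step.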

Recorded Future AI promises to accelerate several additional tasks as well.

Cybersecurity teams regularly produce reports for executives that describe how well the corporate network is protected and where improvements can be made. To create such a report, analysts must manually collect technical data from various systems. Recorded Future AI promises to automate some of those steps, which could shorten the process by several days.

Christopher Ahlberg, Co-founder and Chief Executive Officer, said, “Now, with Recorded Future AI, we believe we can eliminate the cyber skills shortage and increase the capacity for cyber readiness by immediately surfacing actionable intelligence.”

To create the tool, Recorded Future trained the GPT model it obtained from OpenAI on 100 terabytes of cybersecurity data gathered through the startup’s eponymous software platform. The platform offers businesses data on vulnerabilities, cyberattacks, and the servers hackers use to launch malware campaigns.

The tool also draws on research from Insikt Group, the startup’s research arm, in particular the 40,000 analyst notes on online threats that Insikt Group has produced over the years. Cybersecurity teams use these analyst notes to describe hacker strategies and share associated technical data.

Recorded Future is one of many businesses using OpenAI’s GPT family of language models to assist cybersecurity teams in their work.

Last month, Microsoft Corp. unveiled Security Copilot, a service that uses OpenAI’s most recent GPT-4 model. During a breach attempt, the service automatically detects malicious activity and predicts the next moves a hacker is likely to make. Cybersecurity teams can use Security Copilot’s data to guide their efforts to address breaches.

Now-patched Azure Opens Up a Door to Remote Takeover Attacks
https://evaluatesolutions38.com/news/security-news/now-patched-azure-opens-up-a-door-to-remote-takeover-attacks/
Mon, 03 Apr 2023 16:29:43 +0000

Highlights:

  • According to Orca, exploiting the Azure flaw grants threat actors full administrator access to a Service Fabric cluster.
  • CVE-2022-35829, also known as FabriXss, was discovered by Orca Security’s research team.

Microsoft Corp.’s Azure was found to have a previously unknown vulnerability that permitted remote code execution by attackers, according to cloud cybersecurity company Orca Security Ltd.

The “Super FabriXss” flaw, demonstrated at BlueHat IL 2023, shows how a reflected cross-site scripting vulnerability in Azure Service Fabric Explorer can be escalated to unauthenticated remote code execution by abusing a particular toggle in the console, the “Cluster Type” toggle.

Super FabriXss is a cross-site scripting (XSS) vulnerability affecting Azure Service Fabric Explorer, Orca says. It allows remote, unauthenticated attackers to commandeer a container hosted on a Service Fabric node and run code on it.

The XSS flaw escalates into a full remote code execution vulnerability, with no authentication required, when a user clicks a specially constructed malicious link and changes the “Cluster” Event Type setting under the Events tab.

The vulnerability is exploited in two steps. First, an embedded iframe is used to trigger a fetch request. The attacker’s payload then abuses the cluster upgrade procedure to replace the current deployment with a new, malicious one whose Dockerfile contains a CMD instruction that downloads a remote .bat file.

Once downloaded and run, the .bat file retrieves another file containing an encoded reverse shell. By using the reverse shell to obtain remote access to the target system, the attacker can take over the cluster node hosting the container.

Orca Security disclosed the vulnerability to the Microsoft Security Response Center before making it public. After investigating, Microsoft assigned it the designation CVE-2023-23383 and a Common Vulnerability Scoring System score of 8.2, which denotes “important” severity, and then shipped a fix as part of the March 2023 Patch Tuesday release.

Service Fabric Explorer versions 9.1.1583.9590 and earlier are vulnerable. The Orca researchers advise users who have not yet upgraded Service Fabric Explorer to do so to prevent exposure.
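Since exposure hinges on a specific build number, an administrator could sketch a quick cutoff check like the following (illustrative only; the helper names and the dotted-version parsing are assumptions, and 9.1.9999.0 below is a hypothetical later build, not a real release):

```python
# Compare a Service Fabric Explorer build against the last vulnerable
# version named in the advisory (9.1.1583.9590). Dotted versions are
# parsed into integer tuples so the comparison is numeric, not
# lexicographic ("9.1.1583..." would otherwise sort after "9.10...").
LAST_VULNERABLE = "9.1.1583.9590"


def parse_version(version: str) -> tuple[int, ...]:
    """Turn '9.1.1583.9590' into (9, 1, 1583, 9590)."""
    return tuple(int(part) for part in version.split("."))


def is_vulnerable(version: str) -> bool:
    """True if `version` is at or below the last vulnerable build."""
    return parse_version(version) <= parse_version(LAST_VULNERABLE)


print(is_vulnerable("9.1.1583.9590"))  # True  -> upgrade needed
print(is_vulnerable("9.1.9999.0"))     # False -> newer than the cutoff
```

Tuple comparison proceeds element by element, which is why the numeric parse matters for four-part build numbers like these.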

Microsoft Investigates Bing Chatbot Ads
https://evaluatesolutions38.com/news/tech-news/microsoft-investigates-bing-chatbot-ads/
Mon, 03 Apr 2023 14:58:49 +0000

Highlights:

  • Mehdi said that Microsoft is looking into adding an “expanded hover experience” to that part of the interface.
  • Bing Chat is powered by the GPT-4 model that OpenAI LP showed off earlier this month, tuned to work better with the service.

Microsoft Corp. has confirmed that it plans to put ads into Bing Chat to make money.

In a blog post on Wednesday, Yusuf Mehdi, the corporate vice president of Microsoft’s modern life, search, and devices group, said that the company is “looking into putting ads in chat”. The executive also said that publishers would get a share of the money made from these ads.

Last month, Reuters said that Microsoft had talked with a “major advertising agency” about putting ads in Bing Chat. The content being promoted could come in different forms. Microsoft has reportedly tested paid links below Bing Chat answers and a second ad format that targets consumers when they look for a product or service.

It’s been said that some Bing Chat users have already seen ads in the service’s interface.

When a user types in a question, the service usually responds with more than one sentence. After each sentence, Bing Chat links to the page from which it got the information. Some users have said that these reference shortcuts are sometimes replaced with icons that say “Ad” and have a paid link.

When Mehdi confirmed the ad feature, he also said that the company is looking into alternate ways to promote publishers’ content in Bing Chat.

Bing Chat’s response to any search query lists a few links to the original source of information. Mehdi said that Microsoft is looking into adding an “expanded hover experience” to that part of the interface. Bing Chat automatically shows related links when a user hovers over a website link.

Mehdi says adding a new Microsoft Start panel to the Bing Chat interface is another idea. Microsoft Start is an app that lets users get news from several publishers. The company could add a panel to Bing Chat that would display content from the app with responses to search queries.

Mehdi wrote, “The early progress is encouraging. Based on our data from the preview, we are driving more traffic from all types of users. We have brought more people to Bing/Edge for new scenarios like chat and we are seeing increased usage. Then, we have uniquely implemented ways to drive traffic to publishers including citations within the body of the chat answers.”

Bing Chat is powered by the GPT-4 model that OpenAI LP showed off earlier this month, tuned to work better with the service. Bing Chat also uses Prometheus, a second system made by Microsoft, which generates the citations that Bing Chat includes in its answers.

Microsoft has changed Bing Chat since it was first released a month ago. The chatbot can answer in three modes: creative, balanced, and precise. Two weeks ago, Microsoft released an update that lets users ask more questions in one chat session and speeds up Bing Chat’s balanced answer mode.

Replit and Google Cloud Collaborate to Improve Generative AI for Software Development
https://evaluatesolutions38.com/news/cloud-news/replit-and-google-cloud-collaborate-to-improve-generative-ai-for-software-development/
Thu, 30 Mar 2023 17:22:07 +0000

Highlights:

  • The partnership will hasten the development of generative AI applications and demonstrates Google Cloud’s dedication to fostering the most open generative AI ecosystem.
  • Replit is one of the fastest-growing developer platforms, with over 20 million developers and over 240 million projects developed.

Recently, the cloud division of Google LLC announced a partnership with Replit Inc., the company behind a well-known coding platform used by over 20 million programmers.

Under the agreement, Replit will employ Google’s large language models to help programmers write applications more quickly. The companies also plan to build a variety of product integrations.

An IDE, or integrated development environment, is a program software development teams use to write code. These apps combine a code editor with other productivity tools for developers. Replit offers a popular cloud-based IDE of the same name that is meant to be more user-friendly than conventional options.

The startup said last month that more than 20 million developers have signed up for its platform since its introduction. In December 2021, when it had 10 million developers, it closed a USD 80 million investment round led by Coatue. To date, it has raised more than USD 104 million.

The new partnership gives Replit access to Google Cloud’s large language models. The company intends to add the technology to Ghostwriter Chat, an AI-powered coding tool built into Replit’s cloud-based IDE that can explain existing code, debug software issues, and generate code from developers’ text prompts.

June Yang, vice president of cloud AI and industry solutions at Google Cloud, said, “Generative AI can bring significant new capabilities to businesses and developers. Google Cloud infrastructure and foundation models in Vertex AI will power Replit’s widely adopted platform, delivering more performance and scalability to millions of developers around the world.”

Replit and Google Cloud intend to work together on other projects. The startup announced that it would host its cloud-based IDE using the infrastructure of the search engine giant. Also, the companies are developing product integrations to make it simple for Replit users to integrate Google Cloud services into their software projects.

The Alphabet Inc. unit will give the startup’s users a simple way to get set up on Google Cloud products, and Replit is now Google’s preferred IDE. Developers who use the software engineering tools in the Workspace productivity suite and Google Cloud services will have access to the IDE.

One of the biggest cloud competitors to the search giant, Microsoft Corp., is also investing in AI-powered developer tooling. Its GitHub subsidiary unveiled a new version of its Copilot coding tool last week as part of the effort. The tool can generate both code and command line queries and is partially based on OpenAI LLC’s most recent GPT-4 model.

Microsoft Announces Major Expansions to the Cloud Partner Program
https://evaluatesolutions38.com/news/cloud-news/microsoft-announces-major-expansions-to-the-cloud-partner-program/
Thu, 23 Mar 2023 17:24:01 +0000

Highlights:

  • Microsoft recently announced an expansion of the Solutions Partner designations, which were introduced to help customers more accurately identify the partner companies that can most assist them.
  • Microsoft announced that it is extending the scope of its artificial intelligence services for partners, including ChatGPT in the Azure OpenAI Service.

Microsoft Corp. has updated its Cloud Partner Program, giving partners new ways to stand out in the marketplace with their unique goods and services.

When it was introduced in October 2022, the Cloud Partner Program was positioned as an evolution of the previous Microsoft Partner Network. Microsoft recently announced an expansion of the Solutions Partner designations, which were introduced to help customers more accurately identify the partner companies that can best assist them, a sign that the program is still evolving.

The designations are intended to help partners demonstrate their abilities in delivering customer success within six key solution areas, according to Nicole Dezen, Microsoft’s Chief Partner Officer and Corporate Vice President of Global Partner Solutions. Microsoft is now broadening these designations to distinguish independent software vendors, the business partners who concentrate solely on creating software. The new ISV designations include categories like industry, use cases across industries, and particular business-leader imperatives such as marketing and sales.

Dezen explained, “These designations distinguish an application’s specific capabilities and help customers identify proven solutions for their business needs.”

Partners can also distinguish themselves in the market with new Business Applications specializations demonstrating companies’ expertise and experience in specific technical scenarios. They are aligned with the Business Applications solution area for Microsoft Dynamics 365 and will help customers find ideal partners, according to Dezen.

Finance specializations are for partners who can demonstrate knowledge, substantive experience, and documented success in Dynamics 365 Finance; Sales specializations are for partners who have proven experience in customer experience transformation; Service specializations are for partners who have experience delivering personalized services; and Supply Chain specializations are for partners who have the expertise and documented success in Dynamics 365 Supply Chain Management.

According to Dezen, Microsoft’s Cloud Partner Program now offers 28 specializations, providing partners many options for demonstrating their expertise to customers.

Microsoft partners will soon be able to differentiate themselves not only through their expertise but also through their diversity and their efforts to advance “social good.” The hope is that, as a result, customers will more easily find diversely led companies and solutions in Microsoft’s commercial marketplace.

Dezen said, “Inclusion fosters innovation, and many customers are looking to find partners that align to their business values. This is why we are enabling partners to submit relevant diversity and social good business classifications in Partner Center, which will appear in their business profile in the marketplace.”

Dezen said that although many of Microsoft’s partners are direct rivals, the company still hopes to promote greater cooperation. To achieve this, it is launching multiparty private offers. This brand-new service enables partners to collaborate, develop unique offers with unique payouts, and market their products directly to Microsoft customers via its cloud marketplace. Multiparty private offers will go live in preview in the spring and become generally available later this year.

Last but not least, Microsoft announced that it is extending the scope of its artificial intelligence services for partners, including ChatGPT in the Azure OpenAI Service. Beginning in April, partners can apply for preview access to GPT-4, a more potent large language model from ChatGPT creator OpenAI LP. GPT-4 will allow partners to create predictive models, automate procedures, and enhance decision-making for themselves and their clients.

Dezen said, “We believe AI will fundamentally change every software category, unlocking a new wave of productivity growth. We are committed to empowering partners to harness the power of this innovation on behalf of customers around the world, while helping them navigate this new era of technology.”

Microsoft and Adobe Launch AI Image Generators as Competitiveness Heats Up
https://evaluatesolutions38.com/news/tech-news/artificial-intelligence-news/microsoft-and-adobe-launch-ai-image-generators-as-competitiveness-heats-up/
Wed, 22 Mar 2023 17:33:59 +0000

Highlights:

  • Microsoft Corporation stated that its Bing search engine and Edge browser would be able to generate AI-powered images using Bing Image Creator, which is driven by the DALL-E model.
  • Adobe Inc., a provider of creative software, has announced that it will expand its own capabilities with an AI art-generation product called Firefly.

With the debut of OpenAI LP’s DALL-E and Midjourney, which take text prompts and transform them into beautiful artwork, artificial intelligence art generators have become increasingly popular, and recently, two large businesses have joined the party.

Microsoft Corporation stated that its Bing search engine and Edge browser can generate AI-powered images using Bing Image Creator, which is driven by the DALL-E model. Adobe Inc., a creative software provider, has announced that it will expand its capabilities with an AI art-generation product called Firefly.

Those with access to the Bing chat preview will get immediate access to the new AI image generator in “creative” mode. Users can have the interface make artwork for them by simply typing a description of the image they want, along with extra detail such as a location, object, or action; the underlying AI will then generate something based on what it was trained on.

Users may swiftly and simply input anything their imaginations conjure up, such as “make a picture” or “draw an image,” and the AI will do the rest. That includes iterating on an initial image: changing individual image elements or letting the user alter the backdrop or other portions. Edge users who click the Bing Image Creator icon in the sidebar or start a Bing conversation in the browser can access the same functionality.

Microsoft emphasized that the AI image generator has controls to prevent the creation of risky or harmful images and that it will block and alert users who attempt to use prompts to make such images. Pictures made by the AI also carry a watermark icon in the lower left corner to show that Bing Image Creator produced them, though this could likely be cropped off.

Yusuf Mehdi, Corporate Vice President and Consumer Chief Marketing Officer at Microsoft, said, “With these updates and more coming, our goal is to deliver more immersive experiences in Bing and Edge that make finding answers and exploring the web more interesting, useful and fun.”

This functionality is rolling out immediately to English-speaking users who have access to the trial versions of the new Bing and Edge AI features; those who don’t yet have access can sign up for the waiting list.

Adobe Announces Generative AI Tools with Firefly

Adobe refers to Firefly as “a family of generative AI models for creative expression” that it is adding to its applications to enable users to harness the power of AI art generation. These applications will initially include two tools: one that allows users to generate images and another that generates text effects.

The new AI art-generation tools will be linked directly into Adobe’s existing portfolio of cloud products, such as Creative Cloud, Document Cloud, Experience Cloud, and Adobe Express, enabling users to access these capabilities.

The first feature in the current beta is a text-to-image tool like DALL-E or Midjourney, which allows users to write a text prompt and create a series of pictures based on the written description. The second, “text effects,” works like WordArt: users type text and then quickly restyle it, for example as “covered in snow” or “looks like it’s made of cake,” and Firefly applies the specified style to the text on the screen.

Adobe has planned several future capabilities to enable customers to use textual descriptions to change or add to what they’ve previously developed in Adobe’s products. This is the potential of generative AI, which will expand the capabilities of Adobe’s already potent AI products built on Adobe Sensei.

For instance, users of the Photoshop and Illustrator graphical editors could select portions of the digital artwork they are working on and have the AI adjust that part of the image based on context-aware cues. A user could take a picture of a house on a beach, select the house, type “house built of seashells,” and have the AI generate versions of it. Alternatively, they could select the water and have the AI add ships, or select the sky and create an alien battle.

It may also be used to generate new images based on the color schemes and styles of current material, making it easier to create comparable graphics. The same technique may also be used for film editing, as AI can alter the atmosphere and weather.

David Wadhwani, President of the digital media business at Adobe, said, “Generative AI is the next evolution of AI-driven creativity and productivity, transforming the conversation between creator and computer into something more natural, intuitive, and powerful. With Firefly, Adobe will bring generative AI-powered ‘creative ingredients’ directly into customers’ workflows.”

To assuage the anxieties of creators afraid that Adobe may be using their protected works to train the AI, Adobe stated that the initial model is trained on Adobe Stock photos, publicly licensed content, and photographs whose copyright has expired. The purpose is to ensure that all generated images are suitable for commercial use.

Adobe also said it is establishing a mechanism, akin to Adobe Stock, to compensate creators whose artwork is used to train generative AI models such as those behind Firefly. In addition, it is pioneering a worldwide standard under which creators may add a “Do Not Train” metadata tag to their artwork to instruct AI systems not to use it.

Google’s Bard Chatbot Debuts in the U.S. and U.K.
Wed, 22 Mar 2023 16:50:38 +0000

Highlights:

  • According to reports, Bard is faster than the competing chatbot in Microsoft Corporation’s Bing search engine.
  • Last month, Google CEO Sundar Pichai said that Bard will initially be powered by a lightweight version of LaMDA that is optimized for hardware performance.

Google LLC recently announced that its Bard artificial intelligence chatbot would be available to users in the United States and the United Kingdom.

Users can sign up for a waitlist to access Bard via a new webpage created by the search giant. According to The Verge, Google expects the rollout to a wider audience to be “slow.” The company has yet to specify when Bard will be available to the general public.

At launch, Bard lets users ask a limited number of questions per chat session, though there is no limit on the number of chat sessions they can initiate. The service generates up to three distinct draft responses to each query. Beneath the drafts, Bard displays a “Google it” button that opens Google’s search engine in a new tab.

According to reports, Bard responds faster than the competing chatbot in Microsoft Corporation’s Bing search engine. The faster response times are believed to stem from Bard currently having fewer users. The implementation of the underlying large language model may also influence the chatbot’s performance.

When asked for the biography of a well-known media personality, Bard replied, “I do not have enough information about that person to help with your request,” despite the abundance of information available through Google itself. Unlike ChatGPT, however, it did not offer up incorrect facts about the person.

Google first introduced LaMDA, a large language model, in 2021. Last month, Google CEO Sundar Pichai said that Bard will initially be powered by a lightweight version of LaMDA that is optimized for hardware performance. Pichai explained that the version’s reduced hardware requirements would make it easier for Google to offer Bard widely. In a recent blog post, Google Vice Presidents Sissie Hsiao and Eli Collins explained that Bard will be updated with “newer, more capable models over time.”

Google intends to enhance Bard in additional ways. The search giant plans to equip the chatbot with additional language support, the capacity to generate software code, and unspecified multimodal capabilities. A multimodal AI model is a neural network capable of processing text, images, and videos.

Hsiao and Collins wrote recently, “You can use Bard to boost your productivity, accelerate your ideas and fuel your curiosity. You might ask Bard to give you tips to reach your goal of reading more books this year, explain quantum physics in simple terms or spark your creativity by outlining a blog post. We’ve learned a lot so far by testing Bard, and the next critical step in improving it is to get feedback from more people.”

In conjunction with the launch of Bard, Google is integrating large language models into other product components.

The company announced last week that its Workspace productivity suite would soon include various new generative AI features. The capabilities will aid users in composing emails, analyzing data in spreadsheets, and developing presentations.

Google is also incorporating generative AI into its cloud platform. Vertex AI, the company’s suite of cloud services for constructing and deploying neural networks, will provide access to several large language models. The models are being released with the Generative AI App Builder, a new tool that will make it easier for customers to develop machine learning applications.

New Relic Launches Instant Observability to OpenAI’s GPT-4
Mon, 20 Mar 2023 13:31:34 +0000

Highlights:

  • Microsoft Corp. recently announced intentions to integrate ChatGPT with its Bing search engine and made a USD 10 billion investment in OpenAI.
  • New Relic’s software lets businesses keep tabs on their DevOps environments, applications, and supporting infrastructure.

New Relic Inc., a provider of observability tooling, recently announced the release of a ground-breaking machine learning operations feature that enables engineering teams to monitor the performance of OpenAI LLC’s GPT-4 efficiently.

A new OpenAI quickstart is now publicly available in the company’s Instant Observability collection of plugins. With only a few lines of code, organizations can monitor OpenAI completion queries and measure both performance and cost metrics in New Relic in real time. According to the company, its mission is to help businesses maximize the benefits of new artificial intelligence technologies, such as GPT-4, while balancing business requirements and expenses.

Companies can purchase software from New Relic to monitor their DevOps environments, applications, and supporting infrastructure. Its Instant Observability product is a library of more than 400 quickstarts, prebuilt bundles of observability tools created by specialists and reviewed by New Relic. They enable near-instantaneous observability for any supported service or product and can be set up in a few clicks.

The hype surrounding the ChatGPT chatbot, which has demonstrated the remarkable powers of generative AI, has increased the need for GPT-4 observability. OpenAI’s ChatGPT can respond to nearly any query in a human-like manner. It has proved so useful and realistic, if occasionally incorrect, that Google LLC is gravely concerned it might undermine the company’s dominance of internet search.

Microsoft Corp. recently announced plans to integrate ChatGPT with its Bing search engine and made a USD 10 billion investment in OpenAI. Meanwhile, dozens of other businesses have begun using OpenAI’s GPT-4 model for content generation, image development, customer care chatbots, and help desk support.

With more businesses adopting GPT-4, there is growing demand for observability tools that can help optimize cost and usage. New Relic is looking to capitalize on this need with the new release. The company states that users can automatically import a monitoring module from the nr_openai_monitor package with just two lines of code, producing a basic dashboard that displays various GPT-4 performance parameters. Its capabilities include real-time cost tracking and analytics such as average response times and other performance indicators for GPT-4 queries, which can help teams use the service more efficiently.
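As a rough illustration of the kind of cost and latency bookkeeping such a dashboard performs, here is a minimal, self-contained sketch. The per-token prices and the class interface are assumptions for illustration only; they are not New Relic’s actual nr_openai_monitor implementation or OpenAI’s real pricing.

```python
# Illustrative sketch of per-query cost and latency tracking for an
# LLM completion API. Prices below are hypothetical placeholder rates.
PRICE_PER_1K_PROMPT = 0.03       # assumed USD per 1,000 prompt tokens
PRICE_PER_1K_COMPLETION = 0.06   # assumed USD per 1,000 completion tokens

class CompletionMonitor:
    """Accumulates cost and latency metrics across completion calls."""

    def __init__(self):
        self.total_cost = 0.0
        self.latencies = []

    def record(self, prompt_tokens, completion_tokens, latency_s):
        # Token counts would normally come from the API response's usage field.
        cost = (prompt_tokens / 1000) * PRICE_PER_1K_PROMPT \
             + (completion_tokens / 1000) * PRICE_PER_1K_COMPLETION
        self.total_cost += cost
        self.latencies.append(latency_s)

    def average_response_time(self):
        return sum(self.latencies) / len(self.latencies)

monitor = CompletionMonitor()
monitor.record(prompt_tokens=500, completion_tokens=1000, latency_s=2.0)
monitor.record(prompt_tokens=1500, completion_tokens=500, latency_s=4.0)
print(round(monitor.total_cost, 3))    # 0.15
print(monitor.average_response_time()) # 3.0
```

A production observability agent would additionally ship these metrics to a backend and tag them by model and endpoint, but the arithmetic behind “cost per query” and “average response time” dashboards reduces to bookkeeping like this.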

Manav Khurana, chief growth officer and general manager of observability at New Relic, stated that it is an exciting moment for businesses utilizing GPT-4.

Manav Khurana stated, “Observability is a game changer when it comes to helping companies extract value from GPT-4. We are making it so that any engineer using GPT-4 can easily monitor their cost and performance with easy setup and at no cost. This aligns with our mission to put the power of observability into the hands of every engineer.”

Microsoft Unveils Cutting-Edge Azure Instances with AI Optimization
Tue, 14 Mar 2023 19:52:59 +0000

Highlights:

  • Microsoft is taking on this challenge by using its ten years of supercomputing expertise to support the largest AI training workloads.
  • Microsoft’s new ND H100 v5 instances use NVLink, an Nvidia technology, to connect the eight H100 chips to one another.

A new instance family created specifically to run artificial intelligence models has been added to Microsoft Corp.’s Azure cloud platform.

The instance family, dubbed the ND H100 v5 series, premiered recently.

Matt Vegas, a principal project manager in Azure’s high-performance computing and AI group, wrote in a blog post: “Delivering on the promise of advanced AI for our customers requires supercomputing infrastructure, services, and expertise to address the exponentially increasing size and complexity of the latest models. At Microsoft, we are meeting this challenge by applying a decade of experience in supercomputing and supporting the largest AI training workloads.”

Each ND H100 v5 instance features eight H100 graphics processing units from Nvidia Corp. Released last March, the H100 is Nvidia’s most advanced data center GPU. Compared with the company’s previous flagship chip, it can train AI models nine times faster and run them up to 30 times faster.

The H100’s 80 billion transistors are made using a four-nanometer process. The chip includes a specialized module, the Transformer Engine, designed to accelerate AI models built on the Transformer neural network architecture. This architecture powers many cutting-edge AI models, including OpenAI LLC’s ChatGPT chatbot.

The H100 includes further upgrades from Nvidia as well. Among its many features is a built-in confidential computing capability, which can isolate an AI model to block unauthorized access requests from the operating system and hypervisor on which it runs.

Advanced AI models are typically deployed across multiple graphics cards. Used this way, GPUs must communicate with one another frequently to coordinate their work, so companies often link them with high-speed network connections to accelerate data transfer between them.

Microsoft’s new ND H100 v5 instances use NVLink, an Nvidia interconnect technology, to connect the eight H100 GPUs to one another. Nvidia claims the technology is seven times faster than the widely used PCIe 5.0 networking standard. According to Microsoft, NVLink provides 3.6 terabits per second of bandwidth across the eight GPUs in its new instances.
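A quick back-of-the-envelope conversion puts the quoted figure in more familiar units. The calculation below simply restates the article’s numbers (3.6 terabits per second and the claimed 7x factor over PCIe 5.0); the implied PCIe figure is a derived estimate, not a vendor specification.

```python
# Convert the quoted NVLink bandwidth from terabits/s to gigabytes/s.
nvlink_tbits = 3.6
nvlink_gb_per_s = nvlink_tbits * 1e12 / 8 / 1e9  # bits -> bytes, scale to GB
print(nvlink_gb_per_s)  # 450.0

# If NVLink is about seven times faster than PCIe 5.0, as Nvidia claims,
# the implied PCIe 5.0 baseline would be roughly:
pcie_gb_per_s = nvlink_gb_per_s / 7
print(round(pcie_gb_per_s, 1))  # 64.3
```

That implied baseline of roughly 64 GB/s is in the ballpark of a 16-lane PCIe 5.0 link, which is why multi-GPU training setups favor a dedicated interconnect like NVLink.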

The instance series also supports NVSwitch, another Nvidia networking technology. Whereas NVLink connects the GPUs inside a single server, NVSwitch links multiple GPU servers together. This simplifies running complex AI models that must be deployed across numerous systems in a data center.

Microsoft’s ND H100 v5 instances pair the H100 graphics cards with CPUs from Intel Corp.’s 4th Gen Xeon Scalable Processor series, code-named Sapphire Rapids, which debuted in January.

Sapphire Rapids is based on an enhanced version of Intel’s 10-nanometer process. Each CPU in the series includes several onboard accelerators, computing units optimized for specific tasks. Thanks to these integrated accelerators, Intel says, Sapphire Rapids delivers up to 10 times the performance of its previous-generation silicon for some AI applications.
