News Commentary | January 04, 2021
DarwinAI attempts to illuminate how neural networks reach their decisions through its explainable AI (XAI) platform, GenSynth, which can be used to create custom deep learning models for applications including automated inspection and predictive maintenance. Unlike traditional AI platforms, GenSynth...
News Commentary | March 22, 2021
Complying with privacy regulations like GDPR and CCPA is becoming cumbersome and expensive for companies. This has created demand for data management tools like DataGrail's integrated solution, which has more than 900 integrations with different apps and infrastructure platforms. ...
News Commentary | April 05, 2021
UptimeAI claims its predictive maintenance (PdM) software is differentiated by its ability to shrink deep learning models by 10×, allowing it to analyze multiple assets simultaneously based on equipment-process correlations. On paper, this sounds like a differentiating feature. However, the company ...
by Cole McCollum
Following several allegations that the Apple Card discriminated against women, the CEO of Goldman Sachs, the card's issuer, publicly defended the company's decision-making process, stating that "we have not and never will make decisions based on factors like gender." What Goldman Sachs failed to take into account is that machine learning algorithms excel at finding latent features in data: features that are not directly used in training a machine learning model but can be inferred from other features that are. Even if gender is excluded as an input, correlated variables such as spending patterns or occupation can act as proxies for it. Clients should be aware that propagating bias found in historical datasets is one of the biggest challenges in implementing machine learning and should explore bias detection and AI explainability tools that can help alleviate this issue.
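The latent-feature effect can be illustrated with a small synthetic sketch (all data, coefficients, and variable names below are hypothetical illustrations, not figures from the Goldman Sachs case): a model trained on features that exclude gender still reproduces part of a historically biased outcome, because a correlated proxy feature stands in for the excluded attribute.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical synthetic data: 'gender' is never shown to the model,
# but a proxy feature (think spending mix or occupation) correlates with it.
gender = rng.integers(0, 2, n)                 # protected attribute, held out
proxy = gender + rng.normal(0, 0.5, n)         # correlated stand-in feature
income = rng.normal(50, 10, n)                 # independent feature

# Historical credit limits encode a biased rule that used gender directly.
limit = 10 * income + 20 * gender + rng.normal(0, 5, n)

# Fit an ordinary least-squares model on [intercept, income, proxy] only.
X = np.column_stack([np.ones(n), income, proxy])
coef, *_ = np.linalg.lstsq(X, limit, rcond=None)
pred = X @ coef

# Even without gender as an input, predicted limits differ by group.
gap = pred[gender == 1].mean() - pred[gender == 0].mean()
print(f"Mean predicted-limit gap between groups: {gap:.1f}")
```

In this toy setup the noisy proxy lets the model recover a substantial share of the 20-unit gap baked into the historical labels, which is exactly the failure mode bias-detection tools are meant to surface: checking model outputs against protected attributes that were deliberately excluded from training.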