
Edge Computing



Patent US10671434


Issued 2020-06-02

Storage Based Artificial Intelligence Infrastructure

Data transformation offloading in an artificial intelligence infrastructure that includes one or more storage systems and one or more graphical processing unit (‘GPU’) servers, including: storing, within the storage system, a dataset; identifying, in dependence upon one or more machine learning models to be executed on the GPU servers, one or more transformations to apply to the dataset; and generating, by the storage system in dependence upon the one or more transformations, a transformed dataset.
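
The abstract describes offloading dataset transformations to the storage layer so that the GPU servers running the machine learning models receive data that is already in the form those models need. The sketch below is a minimal Python illustration of that idea only; the class and field names (StorageSystem, ModelSpec, needs_normalization, and so on) are assumptions made for illustration, not terms or interfaces from the patent.

    # Minimal sketch of the offloading idea described in the abstract.
    # All names here (StorageSystem, ModelSpec, etc.) are hypothetical.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    Record = Dict[str, float]
    Transformation = Callable[[List[Record]], List[Record]]

    @dataclass
    class ModelSpec:
        """Describes what a model running on the GPU servers expects as input."""
        name: str
        needs_normalization: bool = False
        drop_fields: List[str] = field(default_factory=list)

    @dataclass
    class StorageSystem:
        """Stores a dataset and applies transformations locally, so the GPU
        servers receive a dataset that is already model-ready."""
        dataset: List[Record]

        def identify_transformations(self, model: ModelSpec) -> List[Transformation]:
            # Choose transformations in dependence upon the model to be executed.
            transforms: List[Transformation] = []
            if model.drop_fields:
                transforms.append(lambda rows: [
                    {k: v for k, v in r.items() if k not in model.drop_fields}
                    for r in rows
                ])
            if model.needs_normalization:
                def normalize(rows: List[Record]) -> List[Record]:
                    keys = list(rows[0].keys()) if rows else []
                    maxima = {k: max(abs(r[k]) for r in rows) or 1.0 for k in keys}
                    return [{k: r[k] / maxima[k] for k in keys} for r in rows]
                transforms.append(normalize)
            return transforms

        def generate_transformed_dataset(self, model: ModelSpec) -> List[Record]:
            rows = self.dataset
            for t in self.identify_transformations(model):
                rows = t(rows)
            return rows  # Handed to the GPU servers instead of the raw dataset.

The point the abstract emphasizes is that the transformed dataset is generated by the storage system itself, in dependence upon the models to be executed, rather than by the GPU servers.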



Much Longer than Average Specification



USPTO Full Text Publication

3 Independent Claims

  • 1. A method comprising: identifying, in dependence upon one or more machine learning models to be executed on one or more graphical processing unit (‘GPU’) servers, one or more transformations to apply to a dataset stored within one or more storage systems of a plurality of storage systems; scheduling, by a unified management plane, one or more transformations for one or more of the storage systems to apply to the dataset; and generating, by the storage system in dependence upon the one or more transformations, a transformed dataset.

  • 8. An artificial intelligence infrastructure configured to carry out the steps of: identifying, in dependence upon one or more machine learning models to be executed on one or more graphical processing unit (‘GPU’) servers, one or more transformations to apply to a dataset stored within one or more storage systems of a plurality of storage systems; scheduling, by a unified management plane, one or more transformations for one or more of the storage systems to apply to the dataset; and generating, by the storage system in dependence upon the one or more transformations, a transformed dataset.

  • 15. An apparatus comprising a computer processor, a computer memory operatively coupled to the computer processor, the computer memory having disposed within it computer program instructions that, when executed by the computer processor, cause the apparatus to carry out the steps of: identifying, in dependence upon one or more machine learning models to be executed on one or more graphical processing unit (‘GPU’) servers, one or more transformations to apply to a dataset stored within one or more storage systems of a plurality of storage systems; scheduling, by a unified management plane, one or more transformations for one or more of the storage systems to apply to the dataset; and generating, by the storage system in dependence upon the one or more transformations, a transformed dataset.
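
All three independent claims add a scheduling step beyond the abstract: a unified management plane decides which of the storage systems applies which transformation before the transformed dataset is generated. The self-contained Python sketch below illustrates that claimed arrangement under the same caveat; every name (UnifiedManagementPlane, SimpleStorageSystem, schedule, and so on) is a hypothetical illustration, not language or an interface from the patent.

    # Hedged sketch of the scheduling element shared by claims 1, 8, and 15:
    # a "unified management plane" assigns transformations to storage systems,
    # and each storage system generates its transformed dataset locally.

    from typing import Callable, Dict, List

    Record = Dict[str, float]
    Transformation = Callable[[List[Record]], List[Record]]

    class SimpleStorageSystem:
        """A storage system that can apply transformations to its local dataset."""
        def __init__(self, dataset: List[Record]):
            self.dataset = dataset

        def apply(self, transforms: List[Transformation]) -> List[Record]:
            rows = self.dataset
            for t in transforms:
                rows = t(rows)
            return rows

    class UnifiedManagementPlane:
        """Schedules transformations onto storage systems and gathers results."""
        def __init__(self, systems: Dict[str, SimpleStorageSystem]):
            self.systems = systems
            self.plan: Dict[str, List[Transformation]] = {}

        def schedule(self, system_id: str, transform: Transformation) -> None:
            # Record that this storage system should apply this transformation.
            self.plan.setdefault(system_id, []).append(transform)

        def generate_transformed_datasets(self) -> Dict[str, List[Record]]:
            # Each storage system applies only the transformations scheduled
            # for it; the GPU servers would then read these model-ready results.
            return {system_id: system.apply(self.plan.get(system_id, []))
                    for system_id, system in self.systems.items()}

    # Usage example: scale one system's records before training.
    plane = UnifiedManagementPlane({
        "array-a": SimpleStorageSystem([{"x": 2.0}, {"x": 4.0}]),
    })
    plane.schedule("array-a", lambda rows: [{"x": r["x"] / 4.0} for r in rows])
    print(plane.generate_transformed_datasets())  # {'array-a': [{'x': 0.5}, {'x': 1.0}]}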