Innolitics has deep expertise in image processing and algorithm development. Our algorithms have been deployed in several FDA-cleared applications, and we can build your algorithm as part of a new device or to augment an existing one. Our competencies include:
Natural language processing and textual data mining
Time series signal processing and anomaly detection
Hybrid algorithms that combine traditional techniques with neural networks
Hand-crafted, heuristic-based algorithms
Analyzing images and text contained in medical data formats and standards such as DICOM, NIfTI, and HL7
Writing algorithm results as PDF encapsulated in HL7, PDF encapsulated in DICOM, DICOM structured reports, and DICOM secondary capture images
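As a small illustration of the low-level format work this involves, the sketch below reads the image dimensions out of a NIfTI-1 header using only the Python standard library. In practice we would reach for a library such as pydicom or NiBabel; the field offsets here come from the public NIfTI-1 specification.

```python
import struct

NIFTI1_HEADER_SIZE = 348

def read_nifti1_dims(header: bytes):
    """Parse image dimensions from a NIfTI-1 header (first 348 bytes).

    Returns a list of dimension sizes, e.g. [256, 256, 180] for a 3-D volume.
    """
    if len(header) < NIFTI1_HEADER_SIZE:
        raise ValueError("truncated NIfTI-1 header")
    # sizeof_hdr (int32 at offset 0) doubles as an endianness marker.
    (sizeof_hdr,) = struct.unpack_from("<i", header, 0)
    endian = "<" if sizeof_hdr == NIFTI1_HEADER_SIZE else ">"
    (sizeof_hdr,) = struct.unpack_from(endian + "i", header, 0)
    if sizeof_hdr != NIFTI1_HEADER_SIZE:
        raise ValueError("not a NIfTI-1 header")
    # magic (4 bytes at offset 344): "n+1\0" = single file, "ni1\0" = .hdr/.img pair
    magic = header[344:348]
    if magic not in (b"n+1\x00", b"ni1\x00"):
        raise ValueError("bad NIfTI-1 magic")
    # dim (8 x int16 at offset 40): dim[0] holds the number of dimensions
    dim = struct.unpack_from(endian + "8h", header, 40)
    return list(dim[1:1 + dim[0]])
```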
We provide the following services:
Engineering Consulting
It can be overwhelming to create an algorithm from scratch. We have been there. We can help you answer some common questions:
Design
What kind of algorithm should I use?
Is my problem suitable for machine learning?
Do I have enough data for a machine learning approach?
How should I annotate my data for a machine learning based approach?
Given a budget and problem, what kind of accuracy can I expect?
How can I decompose my problem into chunks that are most likely to be solvable?
Should I use deep learning or a more traditional machine learning approach?
Development
How much will it cost and how long will it take?
How do I source data? How about annotations?
Which data annotation tool should I use?
Should I train neural networks on the cloud or buy my own servers?
Deployment
How can I embed my algorithm into an existing product?
How do I deploy my algorithm in the cloud?
Regulatory Consulting
We are well-versed in the FDA’s stance on AI and ML. We can help you design compliant SOPs for data collection, data annotation, algorithm design, development, and verification and validation.
Here are some common questions we can answer:
Does my algorithm qualify as CADe (computer aided detection) or CADx (computer aided diagnosis)?
Data Sourcing
We can help you find data from a variety of places. Some common sources include:
Publicly available datasets such as the TCIA
Our multi-institutional data sourcing partners
Data from your medical practice or university, which we can help anonymize and upload to the cloud
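Anonymization is often the first hurdle when sourcing clinical data. As a simplified sketch, the snippet below blanks a handful of patient-identifying attributes in a DICOM-like attribute dictionary. The real DICOM de-identification profile (PS3.15) covers far more attributes; the subset here is illustrative only.

```python
# A few attributes covered by the DICOM Basic Application Level
# Confidentiality Profile (PS3.15). This subset is illustrative, not complete.
PHI_ATTRIBUTES = {
    "PatientName",
    "PatientID",
    "PatientBirthDate",
    "InstitutionName",
    "ReferringPhysicianName",
}

def anonymize(dataset: dict) -> dict:
    """Return a copy of a DICOM-like attribute dict with PHI blanked."""
    return {k: ("" if k in PHI_ATTRIBUTES else v) for k, v in dataset.items()}
```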
Data Annotation
We can help you find qualified data annotation specialists for your data. Tap into our network of clinicians and technicians to annotate your data as cost-effectively as possible.
We can also help you devise a data annotation strategy that minimizes the time trained experts must spend. For example, a technician can outline the border of a tumor (a time-consuming task), but it takes a trained radiologist to find the correct slice. Our annotation protocol might therefore have radiologists identify the key slices first and then hand the remaining work over to technicians.
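The two-pass protocol described above can be sketched as a simple workflow. The class and function names here are illustrative, not part of any real annotation tool:

```python
from dataclasses import dataclass, field

@dataclass
class Study:
    study_id: str
    num_slices: int
    key_slices: list = field(default_factory=list)   # filled by the radiologist
    contours: dict = field(default_factory=dict)     # filled by the technician

def radiologist_pass(study: Study, key_slices: list) -> None:
    """Expert step: mark the few slices that actually contain the tumor."""
    study.key_slices = key_slices

def technician_pass(study: Study, outline) -> None:
    """Technician step: outline the tumor border on each marked slice."""
    for s in study.key_slices:
        study.contours[s] = outline(s)
```

For example, a radiologist might flag slices 42–44 of a 120-slice CT study, after which technicians contour only those three slices.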
End-to-End Algorithm Design and Development
You give us annotated data and we give you an algorithm that meets your accuracy and performance targets. Here are some specifics about our process:
Discovery phase: We work with you to gather requirements and understand the business value of the algorithm.
Prototype phase: We deliver the first version of the algorithm as fast as possible. This gives you the chance to verify that we are headed in the direction you anticipate and lets us prove, or disprove, feasibility early in the process.
Iteration phase: Algorithm development can have twists and turns. Sometimes we find edge cases that require a change in direction. Sometimes we find that certain types of data are under-represented in the dataset. Sometimes the algorithm finds human errors in the dataset, which is an encouraging sign. We provide you with weekly updates on the state of development until the performance targets are met.
Design transfer: We can help integrate the algorithm into your product or work with your engineers to do so. Want us to develop your entire product and handle the 510(k) submission for you instead? Check out our end-to-end startup package.
We use the following tools in our toolbox to accomplish the above:
Deep neural network frameworks: TensorFlow and PyTorch
Other tools: Jupyter notebooks, Docker with GPU support, Tesseract OCR, 3D Slicer
FAQ
What if I am not finished annotating my data?
We can train the model as your team annotates data, so you can see
the algorithm’s performance improve over time. We repeat the annotation
and training cycle until the performance targets are met.
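That annotate-and-train cycle can be sketched as a simple loop. The `train`, `evaluate`, and `get_new_annotations` callables below are placeholders for a real pipeline, not part of any specific framework:

```python
def iterate_until_target(train, evaluate, get_new_annotations,
                         target_acc, max_rounds=10):
    """Repeat annotate -> train -> evaluate until the accuracy target is met.

    The three callables are placeholders for your pipeline; their
    signatures here are illustrative.
    """
    dataset = []
    model = None
    for round_num in range(1, max_rounds + 1):
        dataset += get_new_annotations()       # newly annotated examples
        model = train(dataset)                 # retrain on the growing dataset
        accuracy = evaluate(model)             # measure on a held-out set
        print(f"round {round_num}: {len(dataset)} examples, "
              f"accuracy {accuracy:.2f}")
        if accuracy >= target_acc:
            break
    return model
```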
How much does it cost to develop an algorithm?
There are several factors that contribute to the overall cost of an
algorithm. The two most important are the inherent complexity of the
problem and the performance target. As a rule of thumb, you are in
good shape if a toddler could be taught to solve your problem in three
seconds or less. For example, a toddler does not need a medical degree
to trace the outline of a brain tumor. If your problem is not like this,
try to break it into smaller three-second chunks that a toddler
could solve. Each three-second chunk will need to be developed as a
separate algorithm, and the cost scales exponentially with the number of
chunks. For example, a simple task requiring just one chunk may cost $10k,
while a complex task requiring five chunks could cost $1M. Complicated
tasks that require an expert to combine multiple sources of knowledge and
life experience are not solvable with the current state of AI.
As a second rule of thumb, the cost of the algorithm increases roughly
tenfold with each 9 you add to the performance target. For example, if 90%
accuracy costs $10k, then 99% could cost $100k, and 99.9% could cost
$1M.
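Taken together, the two rules of thumb above amount to a rough back-of-the-envelope model. The sketch below encodes them in Python; the base cost and growth factors are illustrative, not a quote:

```python
def estimated_cost(base_cost=10_000, chunks=1, nines=1):
    """Back-of-the-envelope cost model from the rules of thumb above.

    Roughly ~3.16x per extra chunk (so 5 chunks -> ~100x the one-chunk
    cost) and 10x per extra nine of accuracy (90% = one nine, 99% = two,
    99.9% = three). Purely illustrative.
    """
    per_chunk = 100 ** 0.25  # $10k -> $1M across 4 extra chunks
    return base_cost * per_chunk ** (chunks - 1) * 10 ** (nines - 1)
```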
Cloud machine learning training is expensive. Do you provide an alternative?
Yes. We have powerful machine learning servers on site that we use
for your project, free of charge.
Hosting machine learning algorithms on the cloud is expensive. Do you provide an alternative?
For a small monthly fee, we can host your prototype for
non-production use cases such as investor demos, but we recommend moving
to the cloud once you get into the production phase.
When do you use a traditional algorithm vs a deep neural network?
Deep neural networks have disadvantages: they are difficult to debug,
fail in unpredictable ways, and can complicate FDA submissions.
Because of this, we prefer to use traditional algorithmic approaches
when possible. Some problems, especially image classification, are
trivial for the human eye to solve but difficult to describe in a way
that a computer can follow. A deep neural network is useful here
because we do not need to encode the rules ourselves; the network
learns them from many examples. The disadvantage is that we do not
know exactly what it has learned.
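To make the contrast concrete, here is a hand-crafted heuristic in miniature: a fixed intensity threshold that labels bright pixels as foreground. The rule is fully explicit and easy to debug, which is exactly what a deep network does not give you. The threshold value and pixel representation are illustrative:

```python
def threshold_segment(pixels, threshold):
    """Hand-crafted heuristic: label a pixel foreground (1) if its
    intensity exceeds a fixed threshold, else background (0).

    Transparent and debuggable, but it only works when the rule can
    be written down explicitly.
    """
    return [[1 if p > threshold else 0 for p in row] for row in pixels]
```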
Do you have an algorithm design and development process?