[Good First Issue][TF FE]: Support MatrixDiagV3 operation for TensorFlow #23248
Comments
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Hi @RaffaelloFornasiere, any update on this task?
Hi @rkazants, thank you for reaching out. I appreciate your patience regarding the task. I'm working on it, although I must admit I've encountered some challenges. I'm also managing this task alongside work commitments and my master's thesis. While I may not have full-time availability, I'm dedicated to completing it to the best of my ability.
Hello @RaffaelloFornasiere, thanks for the update! Can we help you with any of these challenges? We're here to answer questions.
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Hello @anzr299, are you still working on that issue? Do you need any help?
Hi, I would like to unassign myself. I am focusing on a smaller subset of problems currently. Sorry for the trouble.
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Context
The OpenVINO component responsible for support of TensorFlow models is called the TensorFlow Frontend (TF FE). TF FE converts a model represented in the TensorFlow opset to a model in the OpenVINO opset.
In order to infer TensorFlow models containing the MatrixDiagV3 operation with OpenVINO, TF FE needs to be extended with support for this operation.
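For orientation, MatrixDiagV3 places one or more given diagonals into a (possibly batched) matrix, where an offset k selects which diagonal is written and padding_value fills the remaining cells. Below is a minimal pure-Python sketch of the single-diagonal case, intended only to illustrate the semantics; the function name and defaults are hypothetical, not part of TF FE:

```python
def matrix_diag(diagonal, k=0, num_rows=None, num_cols=None, padding_value=0):
    """Place `diagonal` on offset `k` of a matrix, padding everything else.

    Illustrative sketch of the single-diagonal (scalar k) case of TF's
    MatrixDiagV3. `num_rows`/`num_cols` default to the smallest matrix
    that holds the whole diagonal.
    """
    n = len(diagonal)
    if num_rows is None:
        num_rows = n + max(-k, 0)
    if num_cols is None:
        num_cols = n + max(k, 0)
    out = [[padding_value] * num_cols for _ in range(num_rows)]
    for d, value in enumerate(diagonal):
        # Element d of diagonal k lands at row d + max(-k, 0), col d + max(k, 0).
        r, c = d + max(-k, 0), d + max(k, 0)
        if r < num_rows and c < num_cols:
            out[r][c] = value
    return out
```

For example, matrix_diag([1, 2, 3]) yields the 3x3 matrix with 1, 2, 3 on the main diagonal, while k=1 shifts the values onto the superdiagonal.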
What needs to be done?
For MatrixDiagV3 operation support, you need to implement the corresponding loader in the TF FE op directory and register it in the dictionary of loaders. One loader is responsible for the conversion (or decomposition) of one type of TensorFlow operation.
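The real loaders are written in C++ inside TF FE, but the registration mechanism can be sketched in Python to show the shape of the task: translators live in a dictionary keyed by TF operation type, and each translator reads the node's inputs and attributes from a context object. All names below (TRANSLATOR_MAP, NodeContext, translate_matrix_diag_v3) are simplified stand-ins, not the actual TF FE API:

```python
# Illustrative sketch only: the real loader dictionary and NodeContext are
# C++ (src/frontends/tensorflow); these names are hypothetical stand-ins.
TRANSLATOR_MAP = {}

def register(op_type):
    """Register one translator per TensorFlow operation type."""
    def wrap(fn):
        TRANSLATOR_MAP[op_type] = fn
        return fn
    return wrap

class NodeContext:
    """Packs all info about the inputs and attributes of one TF node."""
    def __init__(self, inputs, attrs):
        self._inputs, self._attrs = inputs, attrs

    def get_input(self, index):
        return self._inputs[index]

    def get_attribute(self, name, default=None):
        return self._attrs.get(name, default)

@register("MatrixDiagV3")
def translate_matrix_diag_v3(node):
    # MatrixDiagV3 carries diagonal, k, num_rows, num_cols and padding_value
    # as inputs, and `align` as an attribute. A real translator would build
    # an OV sub-graph from these; this stub only gathers the pieces.
    diagonal = node.get_input(0)
    k = node.get_input(1)
    align = node.get_attribute("align", "RIGHT_LEFT")
    return {"diagonal": diagonal, "k": k, "align": align}
```

Dispatching then amounts to a dictionary lookup, e.g. TRANSLATOR_MAP["MatrixDiagV3"](node), which mirrors how the frontend selects a loader by operation type.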
Here is an example of a loader implementation for the TensorFlow Einsum operation. In this example, translate_einsum_op converts TF Einsum into OV Einsum. The NodeContext object passed into the loader packs all information about the inputs and attributes of the Einsum operation. The loader retrieves the equation attribute using the NodeContext::get_attribute() method, prepares the input vector, creates an Einsum operation from the OV opset, and returns a vector of outputs.
The responsibility of a loader is to parse operation attributes, prepare inputs, and express the TF operation via a sub-graph of OV operations. The Einsum example demonstrates a resulting sub-graph with a single operation. In PR #19007 you can see an operation decomposed into a multiple-node sub-graph.
Once you are done with the implementation of the translator, you need to implement the corresponding layer test test_tf_MatrixDiagV3.py and put it into the layer_tests/tensorflow_tests directory. To run a layer test, set the TEST_DEVICE environment variable (for example, TEST_DEVICE=CPU) and invoke pytest on the test file from the layer_tests/tensorflow_tests directory.
Hint
Check how MatrixBandPart operation support was implemented here: #23082
Example Pull Requests
Resources
Contact points
Ticket
No response