“Cannot create a tensor proto whose content is larger than 2GB”
June 5, 2024 at 7:55 am · #9616 · vinceFix (Participant)
Hi,
I ran into this error while trying to export my SAEHD model to dfm:
Traceback (most recent call last):
  File "D:\DFL\WF_640\_internal\DeepFaceLab\main.py", line 416, in <module>
    arguments.func(arguments)
  File "D:\DFL\WF_640\_internal\DeepFaceLab\main.py", line 193, in process_exportdfm
    ExportDFM.main(model_class_name = arguments.model_name, saved_models_path = Path(arguments.model_dir))
  File "D:\DFL\WF_640\_internal\DeepFaceLab\mainscripts\ExportDFM.py", line 22, in main
    model.export_dfm ()
  File "D:\DFL\WF_640\_internal\DeepFaceLab\models\Model_SAEHD\Model.py", line 1028, in export_dfm
    ['out_face_mask','out_celeb_face','out_celeb_face_mask']
  File "D:\DFL\WF_640\_internal\python-3.6.8\lib\site-packages\tensorflow\python\util\deprecation.py", line 346, in new_func
    return func(*args, **kwargs)
  File "D:\DFL\WF_640\_internal\python-3.6.8\lib\site-packages\tensorflow\python\framework\graph_util_impl.py", line 281, in convert_variables_to_constants
    variable_names_denylist=variable_names_blacklist)
  File "D:\DFL\WF_640\_internal\python-3.6.8\lib\site-packages\tensorflow\python\framework\convert_to_constants.py", line 1282, in convert_variables_to_constants_from_session_graph
    variable_names_denylist=variable_names_denylist))
  File "D:\DFL\WF_640\_internal\python-3.6.8\lib\site-packages\tensorflow\python\framework\convert_to_constants.py", line 1106, in _replace_variables_by_constants
    None, tensor_data)
  File "D:\DFL\WF_640\_internal\python-3.6.8\lib\site-packages\tensorflow\python\framework\convert_to_constants.py", line 389, in convert_variable_to_constant
    tensor_data.numpy.shape)
  File "D:\DFL\WF_640\_internal\python-3.6.8\lib\site-packages\tensorflow\python\framework\tensor_util.py", line 528, in make_tensor_proto
    "Cannot create a tensor proto whose content is larger than 2GB.")
ValueError: Cannot create a tensor proto whose content is larger than 2GB.

From what I’ve read online, this is a common issue when using a large model (not just in the deepfake world, but in any project using TensorFlow), and the solutions usually involve code tweaks that seem way over my Python level, or simply not having a model larger than 2GB.
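For context, the error comes from a hard cap on the size of a single serialized TensorProto: when the export freezes variables into constants, any one tensor whose raw bytes exceed roughly 2 GiB trips it. Here is a minimal sketch (pure NumPy, with illustrative shapes I made up, not taken from the actual 640 model) of how to estimate whether a tensor would hit the limit:

```python
# Sketch: estimate whether a single tensor's raw bytes would exceed the
# ~2 GiB cap enforced when serializing it into one TensorProto.
import numpy as np

PROTO_LIMIT = 2**31  # ~2 GiB, the protobuf single-message size cap

def exceeds_proto_limit(shape, dtype=np.float32):
    """Return True if a tensor of this shape/dtype is too big for one proto."""
    nbytes = int(np.prod(shape)) * np.dtype(dtype).itemsize
    return nbytes > PROTO_LIMIT

# Hypothetical dense-layer weight shapes (NOT the real model's shapes):
print(exceeds_proto_limit((36864, 65536)))  # ~9.7 GB of float32 -> True
print(exceeds_proto_limit((1024, 1024)))    # 4 MiB of float32 -> False
```

So the failure doesn’t depend on the total model size on disk, but on whether any single variable crosses the cap.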
I am surprised, since I used the 640 pretrained model available here: https://www.deepfakevfx.com/pretrained-models-saehd/df-ud-wf-640-7394/
But I can’t find a similar topic in this forum. Has anyone already run into a similar issue?
Do I have to restart my project using a <2GB model?

Thanks