Feature 1.13 llm update #5063

Merged: 9 commits, Aug 23, 2023
Conversation

talkingwallace (Contributor):

Add support for FATE-LLM 1.3:

- Update the NN framework
- Fix logging and bugs in the NN module
- Support IPR (model watermarking via signature layers)
- Fix typos
- Update the Homo NN framework to support arbiter-side models
- Update the aggregator framework

Signed-off-by: weijingchen <talkingwallace@sohu.com>
Review threads on python/fate_client/pipeline/component/homo_nn.py (outdated, resolved):
Comment on lines 103 to 104:

  self._updated = {'trainer': False, 'dataset': False,
-                  'torch_seed': False, 'loss': False, 'optimizer': False, 'model': False}
+                  'torch_seed': False, 'loss': False, 'optimizer': False, 'model': False, 'ds_config': False, 'server_init': False}
[autopep8] reported by reviewdog 🐶

Suggested change:
-  self._updated = {'trainer': False, 'dataset': False,
-                   'torch_seed': False, 'loss': False, 'optimizer': False, 'model': False, 'ds_config': False, 'server_init': False}
+  self._updated = {
+      'trainer': False,
+      'dataset': False,
+      'torch_seed': False,
+      'loss': False,
+      'optimizer': False,
+      'model': False,
+      'ds_config': False,
+      'server_init': False}


def __init__(self) -> None:
    super().__init__()

[autopep8] reported by reviewdog 🐶

Suggested change: whitespace-only fix.

@property
def embeded_param(self):
    return None

[autopep8] reported by reviewdog 🐶

Suggested change: whitespace-only fix.


"""
Recursively replaces the LayerNorm layers of a given module with SignatureLayerNorm layers.

[autopep8] reported by reviewdog 🐶

Suggested change: whitespace-only fix.

layer_name_set (set[str], optional): A set of layer names to be replaced. If None,
all LayerNorm layers in the module will be replaced.
"""

[autopep8] reported by reviewdog 🐶

Suggested change: whitespace-only fix.
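The docstring in the two threads above describes a helper that recursively replaces the LayerNorm layers of a module with SignatureLayerNorm layers, which carry the IPR watermark. As a reading aid, here is a minimal sketch of how such a replacement can work; the SignatureLayerNorm stub keeps only the attributes visible in this PR (ln, embeded_param, from_layer_norm_layer), and the helper's name and its matching on direct child names are assumptions, not the PR's actual implementation.

import torch.nn as nn

class SignatureLayerNorm(nn.Module):
    # Stub of the PR's SignatureLayerNorm: wraps an nn.LayerNorm as self.ln
    # and exposes its weight vector as the watermark-embedding parameter.
    def __init__(self, normalized_shape=None, ln: nn.LayerNorm = None):
        super().__init__()
        self.ln = ln if ln is not None else nn.LayerNorm(normalized_shape)

    @classmethod
    def from_layer_norm_layer(cls, layer: nn.LayerNorm):
        # Wrap an existing LayerNorm in place instead of creating a new one.
        return cls(ln=layer)

    @property
    def embeded_param(self):
        return self.ln.weight

    def forward(self, x):
        return self.ln(x)

def recursive_replace_layernorm(module: nn.Module, layer_name_set=None):
    # Depth-first walk over named children: swap every nn.LayerNorm
    # (optionally restricted to names in layer_name_set) for a
    # SignatureLayerNorm that wraps the original layer.
    for name, child in module.named_children():
        if isinstance(child, nn.LayerNorm) and \
                (layer_name_set is None or name in layer_name_set):
            setattr(module, name, SignatureLayerNorm.from_layer_norm_layer(child))
        else:
            recursive_replace_layernorm(child, layer_name_set)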

@property
def embeded_param(self):
    return self.ln.weight

[autopep8] reported by reviewdog 🐶

Suggested change: whitespace-only fix.

return self.ln.weight

def embeded_param_num(self):
    return self._embed_param_num

[autopep8] reported by reviewdog 🐶

Suggested change: indentation-only fix to `return self._embed_param_num`.

conv = SignatureConv(3, 384, 3, 1, 1)
layer_norm = SignatureLayerNorm((768, ))
layer_norm_2 = SignatureLayerNorm.from_layer_norm_layer(layer_norm.ln)

[autopep8] reported by reviewdog 🐶

Suggested change: whitespace-only fix.
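The snippet above exercises both watermark layers. With the stub sketched earlier in this thread, the LayerNorm half behaves as follows; SignatureConv is omitted, and reading its (3, 384, 3, 1, 1) arguments as Conv2d-style in/out channels, kernel size, stride, and padding is an inference from the call site, not confirmed by the PR.

# Reuses the SignatureLayerNorm stub from the earlier sketch.
layer_norm = SignatureLayerNorm((768,))
layer_norm_2 = SignatureLayerNorm.from_layer_norm_layer(layer_norm.ln)

# Both wrappers share the same underlying LayerNorm, so they expose the
# very same weight tensor as the embedding target.
assert layer_norm_2.embeded_param is layer_norm.embeded_param
assert layer_norm.embeded_param.shape == (768,)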

talkingwallace and others added 3 commits August 22, 2023 10:42
Signed-off-by: weijingchen <talkingwallace@sohu.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Signed-off-by: Chen <cwjghglbdcj@gmail.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Signed-off-by: Chen <cwjghglbdcj@gmail.com>
@@ -82,7 +86,9 @@ def __init__(self,
                  loss=None,
                  optimizer: OptimizerType = None,
                  ds_config: dict = None,
-                 model: Sequential = None, **kwargs):
+                 model: Sequential = None,
[autopep8] reported by reviewdog 🐶

Suggested change: trailing-whitespace fix to `model: Sequential = None,`.
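The diff above threads the new constructor arguments through the HomoNN pipeline component. Below is a hedged sketch of how a pipeline script might pass them, following the FATE 1.x pipeline style; the ds_config contents and the server_init semantics (initializing a model on the arbiter side, per the PR description) are assumptions, and exact import paths, trainer names, and signatures may differ.

import torch as t
from pipeline import fate_torch_hook
from pipeline.component import HomoNN
from pipeline.component.nn import TrainerParam

fate_torch_hook(t)  # patch torch so modules can be serialized into the job config

model = t.nn.Sequential(t.nn.Linear(784, 32), t.nn.ReLU(), t.nn.Linear(32, 10))

homo_nn_0 = HomoNN(
    name='homo_nn_0',
    model=model,
    loss=t.nn.CrossEntropyLoss(),
    optimizer=t.optim.Adam(model.parameters(), lr=0.001),
    trainer=TrainerParam(trainer_name='fedavg_trainer', epochs=10),
    ds_config={'train_micro_batch_size_per_gpu': 8},  # DeepSpeed config (assumed contents)
    server_init=True,  # assumed: build and initialize a model on the arbiter side
)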

Signed-off-by: weijingchen <talkingwallace@sohu.com>
mgqa34 merged commit f1e60b9 into develop-1.11.3 on Aug 23, 2023. 2 checks passed.

sagewe mentioned this pull request on Dec 27, 2023.