Zai committed on
Commit
a88d601
1 Parent(s): 02fa587

continuing data prep pipeline

.ipynb_checkpoints/CONTRIBUTING-checkpoint.md ADDED
@@ -0,0 +1,56 @@
+ # Contributing to headshot
+
+ Thank you for considering contributing to headshot! Please take a moment to review the following guidelines.
+
+ ## Code of Conduct
+
+ This project and everyone participating in it are governed by the [Code of Conduct](CODE_OF_CONDUCT.md). By participating, you agree to uphold this code. Please report unacceptable behavior to [your email or a dedicated email for issues].
+
+ ## How to Contribute
+
+ 1. Fork the repository.
+
+ 2. Clone the forked repository to your local machine:
+
+ ```bash
+ git clone https://github.com/zaibutcooler/headshot.git
+ ```
+
+ 3. Create a new branch for your feature or bug fix:
+
+ ```bash
+ git checkout -b feature-name
+ ```
+
+ 4. Make your changes and commit them with a descriptive commit message:
+
+ ```bash
+ git add .
+ git commit -m "Add your descriptive message here"
+ ```
+
+ 5. Push the changes to your fork:
+
+ ```bash
+ git push origin feature-name
+ ```
+
+ 6. Create a pull request (PR) from your fork to the main repository.
+
+ 7. Ensure your PR title and description are clear and concise.
+
+ ## Reporting Issues
+
+ If you find any issues or have suggestions, please open an issue on the [Issue Tracker](https://github.com/zaibutcooler/headshot/issues).
+
+ ## Style Guide
+
+ - Follow the existing coding style.
+ - Use meaningful variable and function names.
+ - Write clear and concise documentation.
+
+ ## License
+
+ By contributing, you agree that your contributions will be licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
+
+ Thank you for contributing to headshot!
.ipynb_checkpoints/training-checkpoint.py ADDED
@@ -0,0 +1,2 @@
+ num_epochs = 30
+ out_dir = 'out'
headshot/.ipynb_checkpoints/data_prep-checkpoint.py ADDED
@@ -0,0 +1,26 @@
+ import torch
+ import torch.nn as nn
+ import torch.optim as optim
+ from torch.utils.data import DataLoader, Dataset
+ from torchvision import transforms, datasets
+
+ data_url = ''
+
+
+ class FaceDataset(Dataset):
+     def __init__(self, data, labels, transform=None):
+         self.transform = transform
+         self.data = data
+         self.labels = labels
+
+     def __len__(self):
+         return len(self.data)
+
+     def __getitem__(self, idx):
+         # Load and preprocess the image at the given index
+         image = self.data[idx]
+         label = self.labels[idx]
+
+         if self.transform:
+             image = self.transform(image)
+         return image, label
headshot/.ipynb_checkpoints/headshot-checkpoint.py ADDED
@@ -0,0 +1,9 @@
+ import torch
+ from torch import nn
+
+ class Headshot(nn.Module):
+     def __init__(self):
+         super().__init__()
+
+     def forward(self, x):
+         pass
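For orientation, a minimal sketch of how the empty `Headshot` module might eventually be filled in, assuming a small CNN classifier; the layer shapes, `num_classes`, and the binary face/no-face framing are illustrative assumptions, not part of this commit.

```python
import torch
from torch import nn

class Headshot(nn.Module):
    """Hypothetical layout for illustration; the real architecture is not defined in this commit."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pool -> (N, 32, 1, 1)
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        x = self.features(x)           # (N, 3, H, W) -> (N, 32, 1, 1)
        x = torch.flatten(x, 1)        # -> (N, 32)
        return self.classifier(x)      # -> (N, num_classes)

# Quick shape check with a dummy batch of 4 RGB images.
logits = Headshot()(torch.randn(4, 3, 128, 128))
print(logits.shape)  # torch.Size([4, 2])
```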
headshot/.ipynb_checkpoints/robot-checkpoint.py ADDED
@@ -0,0 +1 @@
+ # to connect with some physical machine
headshot/.ipynb_checkpoints/utils-checkpoint.py ADDED
@@ -0,0 +1,4 @@
+ import matplotlib.pyplot as plt
+
+ def display_img(image):
+     pass
headshot/data_prep.py CHANGED
@@ -0,0 +1,26 @@
+ import torch
+ import torch.nn as nn
+ import torch.optim as optim
+ from torch.utils.data import DataLoader, Dataset
+ from torchvision import transforms, datasets
+
+ data_url = ''
+
+
+ class FaceDataset(Dataset):
+     def __init__(self, data, labels, transform=None):
+         self.transform = transform
+         self.data = data
+         self.labels = labels
+
+     def __len__(self):
+         return len(self.data)
+
+     def __getitem__(self, idx):
+         # Load and preprocess the image at the given index
+         image = self.data[idx]
+         label = self.labels[idx]
+
+         if self.transform:
+             image = self.transform(image)
+         return image, label
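As a quick sanity check, here is a hedged sketch of how `FaceDataset` could be fed to a `DataLoader` once real data is fetched from `data_url`; the dummy tensors and normalization values below are placeholders, not part of this commit.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import transforms

# Dummy stand-ins for the real images and labels (assumption for illustration).
face_images = torch.rand(100, 3, 128, 128)   # 100 RGB crops, 128x128
face_labels = torch.randint(0, 2, (100,))    # binary face / no-face labels

# Placeholder normalization; real statistics would come from the dataset.
preprocess = transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5])

dataset = FaceDataset(face_images, face_labels, transform=preprocess)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

images, labels = next(iter(loader))
print(images.shape, labels.shape)  # torch.Size([16, 3, 128, 128]) torch.Size([16])
```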
headshot/robot.py ADDED
@@ -0,0 +1 @@
+ # to connect with some physical machine
headshot/utils.py CHANGED
@@ -0,0 +1,4 @@
+ import matplotlib.pyplot as plt
+
+ def display_img(image):
+     pass
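A minimal sketch of what `display_img` might grow into, assuming `image` arrives either as a `(C, H, W)` torch tensor or an `(H, W, C)` array; the tensor conversion step is an assumption, not part of this commit.

```python
import matplotlib.pyplot as plt
import torch

def display_img(image):
    # Assumption: accept a torch tensor in (C, H, W) layout or an array in (H, W, C).
    if isinstance(image, torch.Tensor):
        image = image.detach().cpu().permute(1, 2, 0).numpy()
    plt.imshow(image)
    plt.axis('off')
    plt.show()
```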
notebooks/.ipynb_checkpoints/detection_pytorch-checkpoint.ipynb CHANGED
@@ -154,9 +154,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-  "# eval model\n",
-  "def evaluate():\n",
-  " pass"
+  "# eval model\n"
   ]
  },
  {
notebooks/detection_pytorch.ipynb CHANGED
@@ -154,9 +154,7 @@
   "metadata": {},
   "outputs": [],
   "source": [
-  "# eval model\n",
-  "def evaluate():\n",
-  " pass"
+  "# eval model\n"
  ]
  },
  {
training.py CHANGED
@@ -1 +1,2 @@
- num_epochs=30
+ num_epochs = 30
+ out_dir = 'out'
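For context, a hedged sketch of how `num_epochs` and `out_dir` from `training.py` might eventually drive a training loop and checkpointing; the model, optimizer, and dummy `train_loader` below are placeholders, not part of this commit.

```python
import os
import torch

num_epochs = 30
out_dir = 'out'
os.makedirs(out_dir, exist_ok=True)

# Placeholders standing in for objects the commit has not defined yet.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()
train_loader = [(torch.randn(8, 10), torch.randint(0, 2, (8,)))]  # one dummy batch

for epoch in range(num_epochs):
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
    # Save a checkpoint into out_dir after each epoch.
    torch.save(model.state_dict(), os.path.join(out_dir, f'epoch_{epoch}.pt'))
```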
untitled.txt ADDED
File without changes