Index page updates & finalize blog post
calpt committed Aug 10, 2024
1 parent b3907d5 commit 39073d6
Showing 4 changed files with 56 additions and 43 deletions.
19 changes: 4 additions & 15 deletions app/templates/base.html
@@ -114,22 +114,11 @@
<div class="container">
<p class="float-md-right text-center text-md-right">
<a href="https://arxiv.org/abs/2311.11077" target="_blank">Paper</a>
<span class="text-black-30 px-1">|</span>
<a href="{{ url_for('main.imprint_privacy') }}">Imprint & Privacy</a>
<!--<span class="text-black-30 px-1">|</span>
<a href="{{ url_for('main.imprint_privacy') }}">Imprint & Privacy</a>-->
</p>
<p class="text-muted text-center text-md-left">Brought to you with ❤️ &nbsp;by authors from:</p>
<div class="text-center text-md-left logos">
<div class="r1">
<img src="{{ url_for('static', filename='logos/ukp_logo.png') }}" class="logo logo-ukp">
<img src="{{ url_for('static', filename='logos/tu_logo_web.svg') }}" class="logo logo-tud">
<img src="{{ url_for('static', filename='logos/nyu_short_color.png') }}" class="logo logo-nyu">
</div>
<div class="r2">
<img src="{{ url_for('static', filename='logos/cambridge.png') }}" class="logo logo-cambridge">
<img src="{{ url_for('static', filename='logos/DeepMind_logo.png') }}" class="logo logo-deepmind">
</div>
</div>
</div>
<p class="text-muted text-center text-md-left">Brought to you with ❤️ by the AdapterHub Team</p>
</div>
</footer>

<script>
10 changes: 0 additions & 10 deletions app/templates/blog_post.html
Expand Up @@ -69,16 +69,6 @@ <h1>{{ post['title'] }}</h1>
</div>
</div>

<div class="blog-share col-lg-2 mt-2">
<div class="share-sheet px-3 pt-2 pb-3">
<div class="share-header text-muted font-weight-bold text-center text-uppercase mb-1">
Share
</div>
<a href="https://twitter.com/intent/tweet?text={{ post.title+' - '+config.FREEZER_BASE_URL+post.path }}">
<i class="fab fa-twitter"></i>
</a>
</div>
</div>
</div>

{% endblock %}
63 changes: 48 additions & 15 deletions app/templates/index.html
@@ -1,6 +1,6 @@
{% extends 'base.html' %}

{% block title %} {{ n_adapters }} adapters for {{ n_subtasks }} text tasks and {{ n_languages }} languages {% endblock %}
{% block title %} Home of Adapters, the library for parameter-efficient and modular fine-tuning {% endblock %}

{% block banner %}
<div class="container">
@@ -14,17 +14,26 @@
<div class="row">
<div class="text-light col-md-9">
<p class="highlight-text my-0">
A <span class="highlight">central repository</span>
for pre-trained <span class="highlight">adapter modules</span>
Home of <span class="highlight"><i>Adapters</i></span>, the library
for <br><span class="highlight">parameter-efficient</span> and <span class="highlight">modular</span> fine-tuning
<!-- A <span class="highlight">central repository</span>
for pre-trained <span class="highlight">adapter modules</span> -->
</p>
<p class="mt-2">
<!-- <p class="mt-2">
<span class="badge badge-light p-1 py-md-2 mr-md-1 px-md-3 text-black-65">{{ n_adapters }} <span class=" font-weight-normal">adapters</span></span>
<span class="badge badge-light p-1 py-md-2 mr-md-1 px-md-3 text-black-65">{{ n_subtasks }} <span class=" font-weight-normal">text tasks</span></span>
<span class="badge badge-light p-1 py-md-2 px-md-3 text-black-65">{{ n_languages }} <span class=" font-weight-normal">languages</span></span>
</p>
</p> -->
<pre class="text-white py-2 px-3 my-4 d-none d-md-block">pip install adapters</pre>
<div id="IndexButtonRow" class="rounded mt-lg-1">
<a class="btn"
href="{{ url_for('main.blog') }}">
<div>
<i class="fas fa-bullhorn"></i>
</div>
Blog
</a>
<a class="btn d-none d-sm-inline-block"
href="{{ url_for('main.explore_tasks') }}">
<div>
<i class="fas fa-binoculars"></i>
@@ -38,13 +47,6 @@
</div>
Docs
</a>
<a class="btn d-none d-sm-inline-block"
href="{{ url_for('main.blog') }}">
<div>
<i class="fas fa-bullhorn"></i>
</div>
Blog
</a>
<a class="btn d-none d-lg-inline-block"
href="https://github.com/Adapter-Hub/adapters">
<div>
@@ -226,7 +228,19 @@ <h1>Citation 📝</h1>
<p>
If you use the <i>Adapters</i> library in your work, please consider citing our library paper: <a href="https://arxiv.org/abs/2311.11077"> Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning </a>
</p>
<pre class="p-4 code">
<div id="accordion" class="mb-3">
<div class="card">
<div class="card-header" id="headingOne">
<h5 class="mb-0">
<button class="btn btn-link" data-toggle="collapse" data-target="#collapseOne" aria-expanded="true" aria-controls="collapseOne">
Poth et al. “Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning.” EMNLP (2023).
</button>
</h5>
</div>

<div id="collapseOne" class="collapse" aria-labelledby="headingOne" data-parent="#accordion">
<div class="card-body">
<pre class="p-4 code small">
@inproceedings{poth-etal-2023-adapters,
title = "Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning",
author = {Poth, Clifton and
@@ -247,12 +261,27 @@ <h1>Citation 📝</h1>
url = "https://aclanthology.org/2023.emnlp-demo.13",
pages = "149--160",
}</pre>
</div>
</div>
</div>
</div>

<p>
Alternatively, for the Hub infrastructure and adapters uploaded by the AdapterHub team, please consider citing our initial paper: <a href="https://arxiv.org/abs/2007.07779"> AdapterHub: A Framework for Adapting Transformers </a>
</p>

<pre class="p-4 code">
<div id="accordion" class="mb-3">
<div class="card">
<div class="card-header" id="headingTwo">
<h5 class="mb-0">
<button class="btn btn-link" data-toggle="collapse" data-target="#collapseTwo" aria-expanded="true" aria-controls="collapseTwo">
Pfeiffer et al. “AdapterHub: A Framework for Adapting Transformers.” EMNLP (2020).
</button>
</h5>
</div>

<div id="collapseTwo" class="collapse" aria-labelledby="headingTwo" data-parent="#accordion">
<div class="card-body">
<pre class="p-4 code small">
@inproceedings{pfeiffer2020AdapterHub,
title={AdapterHub: A Framework for Adapting Transformers},
author={Jonas Pfeiffer and
@@ -270,6 +299,10 @@ <h1>Citation 📝</h1>
url = "https://www.aclweb.org/anthology/2020.emnlp-demos.7",
pages = "46--54",
}</pre>
</div>
</div>
</div>
</div>
</section>


7 changes: 4 additions & 3 deletions posts/2024/08/adapters-update-reft-qlora-merging-models.md
@@ -1,6 +1,6 @@
---
title: "Adapters Updates: ReFT, QLoRA, Merging, New Models & Hub"
date: 2024-08-06
title: "Adapters Library Updates: ReFT, QLoRA, Merging, New Models & Hub"
date: 2024-08-10
authors:
- name: Clifton Poth
twitter: "@clifapt"
@@ -16,7 +16,7 @@ summary: |
Today we are releasing the newest updates in our Adapters library. This post summarizes new features in the latest release as well as selected new features since our initial release in Nov 2023, including new adapter methods, new supported models and Hub updates.
---

Eight months ago, [we released _Adapters_](https://adapterhub.ml/blog/2023/11/introducing-adapters/), our new unified library for parameter-efficient and modular fine-tuning.
Nine months ago, [we released _Adapters_](https://adapterhub.ml/blog/2023/11/introducing-adapters/), our new unified library for parameter-efficient and modular fine-tuning.
_Adapters_ directly continues our work since 2020 on `adapter-transformers`, the first open-source library for parameter-efficient fine-tuning.
Since its initial release, _Adapters_ has received several updates, the newest of which ships today.
In this post, we'll go through the most exciting new features from this release and from the past few months.
@@ -99,6 +99,7 @@ model.train_adapter("loreft_adapter")
Learn more about training adapters [in this notebook](https://github.com/adapter-hub/adapters/blob/main/notebooks/01_Adapter_Training.ipynb).
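For context, `train_adapter` activates the named adapter for training while keeping the pre-trained base-model weights frozen. The following is a minimal plain-Python sketch of that freezing pattern, not the Adapters library API; the `Param` class and the parameter names are hypothetical stand-ins for real model parameters:

```python
# Illustrative sketch of "train only the adapter": base-model parameters are
# frozen and only parameters belonging to the named adapter stay trainable.
# This mimics the idea behind train_adapter, not the actual library internals.

class Param:
    def __init__(self, name):
        self.name = name
        self.requires_grad = True  # everything trainable by default


def freeze_all_but_adapter(params, adapter_name):
    """Keep only the named adapter's parameters trainable; freeze the rest."""
    for p in params:
        p.requires_grad = adapter_name in p.name
    return [p for p in params if p.requires_grad]


params = [
    Param("encoder.layer.0.attention.weight"),           # base model -> frozen
    Param("encoder.layer.0.loreft_adapter.rotation"),    # adapter -> trainable
    Param("encoder.layer.1.loreft_adapter.bias"),        # adapter -> trainable
]
trainable = freeze_all_but_adapter(params, "loreft_adapter")
print([p.name for p in trainable])
# ['encoder.layer.0.loreft_adapter.rotation', 'encoder.layer.1.loreft_adapter.bias']
```

An optimizer built only over `trainable` then updates a small fraction of the total parameters, which is the core of parameter-efficient fine-tuning.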

## Adapter Merging

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/06_Task_Arithmetics.ipynb)

We've expanded support for adapter merging, enabling the efficient combination of trained adapters without additional fine-tuning. Merging multiple adapters into a new one enables efficient domain, language and task transfer. Adapter merging is a form of task arithmetic ([Ilharco et al., 2023](https://arxiv.org/abs/2212.04089); [Zhang et al., 2023](https://proceedings.neurips.cc/paper_files/paper/2023/hash/299a08ee712d4752c890938da99a77c6-Abstract-Conference.html)) and hence also allows amplifying or unlearning specific skills. All adapter methods support linear merging. For *N* adapters with parameters $\Phi_i$ the merged adapter parameters $\Phi_{merged}$ are calculated as:
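Linear merging amounts to a weighted sum over the adapters' parameters. A minimal pure-Python sketch under that assumption, with dicts of floats standing in for real weight tensors (`merge_adapters` is an illustrative helper, not the library API):

```python
# Sketch of linear adapter merging (task arithmetic): the merged parameters
# are the weighted sum of the individual adapters' parameters.
# Plain dicts of floats stand in for weight tensors for illustration.

def merge_adapters(adapters, weights):
    """Linearly combine several adapters' parameter dicts with given weights."""
    assert len(adapters) == len(weights)
    merged = {}
    for key in adapters[0]:
        merged[key] = sum(w * a[key] for a, w in zip(adapters, weights))
    return merged


# Two toy adapters sharing the same parameter names.
phi_1 = {"lora_A": 1.0, "lora_B": 2.0}
phi_2 = {"lora_A": 3.0, "lora_B": -2.0}

merged = merge_adapters([phi_1, phi_2], weights=[0.5, 0.5])
print(merged)  # {'lora_A': 2.0, 'lora_B': 0.0}
```

Negative weights implement the "unlearning" direction of task arithmetic; the linked Colab notebook demonstrates the actual merging call on real adapter weights.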
